
(1)

Vision-based systems for

autonomous driving and mobile robots navigation

LUKAS HÄFLIGER – SUPERVISED BY MARIAN GEORGE

(2)


(3)
(4)


(5)
(6)

Google Chauffeur


(7)

Motivation

Environments where humans cannot operate

Great distances where manual control is not feasible

Regular tasks

Time saving

Improving safety

(8)

Introduction

◦ AGV – Autonomous Ground Vehicle

◦ AUV – Autonomous Underwater Vehicle

◦ UAV – Unmanned Aerial Vehicle


(9)

Mobile robot navigation

Overview (taxonomy of the presented systems):

◦ Mobile robot navigation

Indoor: map-based, map-building, and mapless systems

Outdoor: structured and unstructured environments

◦ Autonomous driving

Approaches and goals

(10)

Indoor – Map-based systems

◦ The robot is provided with a map

◦ Needs to localize itself within the map


(11)

Indoor – Map-based systems

◦ Robot needs to correct its trajectory if it does not match the calculated trajectory

http://www.cs.cmu.edu/

(12)

Indoor – Map-based systems

◦ The robot is provided with a map

◦ Needs to localize itself within the map

◦ Robot needs to correct its trajectory if it does not match the calculated trajectory (see the sketch after this list)

◦ Different approaches

Force fields

Occupancy grids

Landmark tracking

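As a minimal illustration of the correction step above, the sketch below compares the localized pose against the planned trajectory and steers back toward it with a simple proportional rule; the controller, its gains, and all names are assumptions for illustration, not the method of FUZZY-NAV or any other cited system.

```python
import math

def steering_correction(pose, path_point, k_heading=1.5, k_cross=0.8):
    """Proportional correction toward the planned trajectory.

    pose:       (x, y, heading) estimated by localization within the given map
    path_point: (x_ref, y_ref, heading_ref) closest point on the planned path
    Returns a steering command in radians; gains are illustrative.
    """
    x, y, th = pose
    xr, yr, thr = path_point
    # Cross-track error: lateral offset of the robot from the planned path
    cross = -(x - xr) * math.sin(thr) + (y - yr) * math.cos(thr)
    # Heading error, wrapped to [-pi, pi]
    dth = (thr - th + math.pi) % (2 * math.pi) - math.pi
    return k_heading * dth - k_cross * cross
```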

(13)

Prominent robot: FUZZY-NAV

[PAN1995]

(14)


Force field
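The force-field idea can be illustrated as an attractive force toward the goal plus repulsive forces from nearby obstacles, with the robot moving along the resulting vector. The sketch below uses assumed gains and step size and is not the exact formulation of any cited system.

```python
import numpy as np

def potential_field_step(robot, goal, obstacles,
                         k_att=1.0, k_rep=100.0, influence=2.0):
    """One step of force-field (potential field) navigation.

    The goal attracts the robot, nearby obstacles repel it; the robot
    moves along the resulting force vector. Parameters are illustrative.
    """
    robot, goal = np.asarray(robot, float), np.asarray(goal, float)
    force = k_att * (goal - robot)                 # attractive component
    for obs in obstacles:
        diff = robot - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:                   # only nearby obstacles repel
            force += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    step = force / (np.linalg.norm(force) + 1e-9)  # unit step along the force
    return robot + 0.1 * step                      # 0.1 m step size (assumed)
```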

(15)

Occupancy grid
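An occupancy grid stores, per cell, the belief that the cell is blocked; a standard way to maintain it is a log-odds update from sensor readings. The sketch below assumes this common formulation; class and parameter names are illustrative.

```python
import numpy as np

class OccupancyGrid:
    """Log-odds occupancy grid: each cell stores the belief that it is occupied."""

    def __init__(self, width, height, l_occ=0.85, l_free=-0.4):
        self.logodds = np.zeros((height, width))
        self.l_occ, self.l_free = l_occ, l_free

    def update_cell(self, row, col, hit):
        """Bayesian update of one cell from a sensor reading (hit = obstacle seen)."""
        self.logodds[row, col] += self.l_occ if hit else self.l_free

    def probability(self):
        """Convert log-odds back to occupancy probabilities in [0, 1]."""
        return 1.0 - 1.0 / (1.0 + np.exp(self.logodds))
```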

(16)

Indoor – Map-building systems

◦ In a first step, the robot explores the environment until enough information has been gathered

◦ In a second step, navigation starts using the autonomously generated map

◦ Different approaches:

Stereo 3D reconstruction

Occupancy grid

Topological representation (feasible alternative to occupancy grids)


(17)

Stereo 3D reconstruction
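The core relation behind stereo 3D reconstruction is that depth is inversely proportional to disparity, Z = f·B/d for focal length f and baseline B. A minimal sketch, assuming a precomputed disparity map:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo 3D reconstruction core relation: Z = f * B / d.

    disparity_px: per-pixel disparity between left and right image (pixels)
    focal_px:     focal length in pixels; baseline_m: camera baseline in metres
    Returns per-pixel depth in metres (infinite where disparity is zero).
    """
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return focal_px * baseline_m / d
```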

(18)

Topological representation


[THRUN1996]
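A topological representation stores places as graph nodes and traversable connections as edges, so route planning reduces to graph search. A minimal sketch with made-up place names (not the map from [THRUN1996]):

```python
from collections import deque

# Topological map: places are nodes, traversable corridors are edges
# (place names are invented for illustration).
topo_map = {
    "lab":      ["corridor"],
    "corridor": ["lab", "kitchen", "office"],
    "kitchen":  ["corridor"],
    "office":   ["corridor"],
}

def plan_route(graph, start, goal):
    """Breadth-first search over the place graph; returns a list of places."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

print(plan_route(topo_map, "lab", "office"))  # ['lab', 'corridor', 'office']
```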

(19)

Indoor – Mapless systems

◦ The robot is not provided with a map

◦ Needs to detect and drive around obstacles

◦ Needs to localize itself within the environment

◦ Different approaches:

Optical Flow

Appearance-based

(20)

Optical Flow


[GUZEL2010]
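One common optical-flow navigation strategy is to balance the flow magnitude in the left and right image halves: the side that flows faster is closer, so the robot turns away from it. The sketch below uses OpenCV's dense Farnebäck flow as an assumed implementation detail and is not claimed to be the exact method of [GUZEL2010].

```python
import cv2
import numpy as np

def flow_balance_turn(prev_gray, curr_gray):
    """Optical-flow balance strategy on two successive grayscale frames.

    If the left half of the image flows faster than the right half, obstacles
    are closer on the left, so turn right, and vice versa.
    Returns a turn command in [-1, 1] (negative values mean turn left).
    """
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)          # per-pixel flow magnitude
    half = mag.shape[1] // 2
    left, right = mag[:, :half].mean(), mag[:, half:].mean()
    return (left - right) / (left + right + 1e-9)
```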

(21)

Appearance based

◦ Based on image templates stored during a previous recording phase

◦ The robot self-localizes and navigates using these templates (see the sketch below)
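A minimal sketch of appearance-based self-localization by template matching; the template dictionary, matching score, and acceptance threshold are assumptions for illustration only.

```python
import cv2

def localize_by_appearance(current_view, place_templates):
    """Compare the current camera view against image templates recorded
    earlier and return the best-matching place name (or None).

    place_templates: dict mapping place name -> grayscale template image
    (templates must be no larger than the current view).
    """
    best_place, best_score = None, -1.0
    for place, template in place_templates.items():
        result = cv2.matchTemplate(current_view, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_place, best_score = place, score
    return best_place if best_score > 0.7 else None   # 0.7: assumed threshold
```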

(22)

Mobile robot navigation


(Overview diagram repeated; next part: outdoor navigation in structured and unstructured environments.)

(23)

Outdoor – structured environments

◦ Essentially road following

Detect the lane markings of the road and steer the robot accordingly

◦ Different approaches

Laser range finders

Machine learning

GPS

Obstacle maps

(24)

Meet STANLEY


(25)

[THRUN2006]

(26)


[THRUN2006]

(27)
(28)

Outdoor – unstructured environments

◦ Random exploration

Only needs reactive obstacle detection

◦ Mission-based exploration

The robot has a goal position

Robot needs to map the environment

Robot needs to localize itself in the map

◦ Different approaches

Stereo vision

Ladar

Visual odometry


(29)

Prominent example: Curiosity

(30)

Visual odometry

Incremental motion estimation by visual feature tracking (a sketch of the last step follows these steps)

Select features

Match in 3D with stereo vision to get 3D coordinates

Solve for the motion between successive 3D coordinates

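A minimal sketch of the last step: given matched 3D feature points from two successive stereo frames, the incremental rigid motion can be recovered with the standard SVD (Kabsch) solution. Feature selection and stereo triangulation are assumed to have been done upstream.

```python
import numpy as np

def motion_between_frames(pts_prev, pts_curr):
    """Solve for the rigid motion R, t that maps the previous 3D feature
    points onto the current ones (both N x 3 arrays of matched points)."""
    c_prev, c_curr = pts_prev.mean(axis=0), pts_curr.mean(axis=0)
    H = (pts_prev - c_prev).T @ (pts_curr - c_curr)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                          # fix an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_curr - R @ c_prev
    return R, t                                       # incremental camera motion
```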

(31)

Ladar – Laser detection and ranging

(32)

Autonomous driving


(Overview diagram repeated; next part: autonomous driving approaches and goals.)

(33)
(34)

Autonomous driving - goals

◦ Reliable pedestrian detection

◦ Detect and interpret road signs

◦ Detect obstacles (other cars, trees on the street,…)

◦ Follow the road within the given lane boundaries

◦ React to traffic signals such as red lights

◦ …


(35)

Approaches – Reliable pedestrian detection

◦ Stereo vision [CHOI2012]

◦ Predict pedestrian motions [BERGER2012]

◦ Shape recognition [FRANKE1998]

(36)

Approaches – Detect road signs

◦ Stereo vision [FRANKE1998]

◦ Detection based on shape, color and motion [FRANKE1998]

◦ MSRC [GALLEGUILLOS2010]


(37)

Approaches – Obstacle detection

Obstacle maps [CHOI2012]

[CHOI2012]

(38)

Approaches – Road following

◦ Follow the road within the given lane boundaries

Dark-light-dark transitions [CHOI2012] (see the sketch below)


[CHOI2012]
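A dark-light-dark transition is a short bright stripe on a darker road surface, i.e. a candidate lane marking. The sketch below scans a single grayscale image row for such stripes; the thresholds and maximum stripe width are assumed values, not those used in [CHOI2012].

```python
import numpy as np

def dark_light_dark_columns(row, bright=180, dark=100, max_width=30):
    """Return the column centers of dark-light-dark transitions in one
    grayscale image row (short bright runs bounded by dark pixels)."""
    centers, i = [], 0
    while i < len(row):
        if row[i] > bright:                     # rising edge: dark -> light
            start = i
            while i < len(row) and row[i] > bright:
                i += 1
            width = i - start
            left_dark = start > 0 and row[start - 1] < dark
            right_dark = i < len(row) and row[i] < dark
            if left_dark and right_dark and width <= max_width:
                centers.append(start + width // 2)
        else:
            i += 1
    return centers                              # candidate lane-marking columns
```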

(39)

Approaches – Traffic signals

◦ React to traffic signals such as red lights

Camera-based [LEVINSON2011]

[LEVINSON2011]
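For illustration only, a very naive camera-based red-light check on an image region around an expected traffic light: threshold red hues in HSV and test their area fraction. This is a stand-in sketch, not the mapped-traffic-light approach of [LEVINSON2011].

```python
import cv2
import numpy as np

def looks_like_red_light(bgr_roi, min_fraction=0.02):
    """Naive red-light test on a BGR region of interest: count pixels whose
    hue falls in the red ranges and compare against an assumed area fraction."""
    hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)
    low_red = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))      # hue near 0
    high_red = cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))  # hue near 180
    red_fraction = np.count_nonzero(low_red | high_red) / low_red.size
    return red_fraction > min_fraction
```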

(40)

Thank you for your attention


(41)

Image Reference

Slide 2: http://farm7.staticflickr.com/6087/6145774669_b855d4a0fa_o.jpg

Slide 3: http://persistentautonomy.com/wp-content/uploads/2013/12/DSC_1053.jpg

Slide 4: http://25.media.tumblr.com/0c2b1a9479dc09971df4d15f05cc77d5/tumblr_mpqtp1BtTa1rdiu71o2_1280.jpg

Slide 5: http://electronicdesign.com/site-files/electronicdesign.com/files/archive/electronicdesign.com/content/content/74282/74282_fig1-nasa-curiosity-landing.jpg

Slide 10: http://www.cs.cmu.edu/~maxim/img/mobplatforminautonav_2.PNG

Slide 11: http://www.cs.cmu.edu/

Slide 14: https://eris.liralab.it/wiki/D4C_Framework

Slide 15: http://www.emeraldinsight.com/content_images/fig/0490390507007.png

Slide 17: http://www.vis.uni-stuttgart.de/uploads/tx_visteaching/cv_teaser3_01.png

Slide 21: http://www.extremetech.com/extreme/115131-learn-how-to-program-a-self-driving-car-stanfords-ai-guru-says-he-can-teach-you-in-seven-weeks

(42)

Slide 29: http://f.blick.ch/img/incoming/origs2243351/4650486351-w980-h640/Curiosity.jpg

Slide 30: http://www.inrim.it/ar2006/ar/va_quattro1581.png

Slide 31: http://www.hizook.com/files/users/3/Velodyne_LaserRangeFinder_Lidar_Visualization.jpg

Slide 33: http://mindcater.com/wp-content/uploads/2013/08/bosch-dubai-Autonomous-Driving.jpg

Slide 35: http://opticalengineering.spiedigitallibrary.org/article.aspx?articleid=1158526

Slide 36: http://www.cse.buffalo.edu/~jcorso/r/semlabel/files/msrc-montage.png


(43)

STANLEY details

VW Touareg

Drive-by-wire system by VW

7 Pentium M processors

4 Ladars

Radar system

Stereo vision camera pair

Monocular vision system

Data rates between 10 Hz and 100 Hz

(44)

Curiosity details

900 kg

2.90 m × 2.70 m × 2.20 m

Plutonium-powered radioisotope thermoelectric generator (RTG)

RAD750 CPU with up to 400 MIPS

Multiple scientific instruments

Stereo 3D navigation with 8 cameras (4 as backup)

$2.5 billion


(45)

Google Chauffeur details

Equipment worth roughly $150,000

LIDAR
