Visual and laser sensory data fusion for outdoor robot localisation and navigation

Stefano Pagnottelli, Sergio Taraglio, Paolo Valigi, Andrea Zanela

Research output: Contribution to conference › Paper

12 Citations (Scopus)

Abstract

An architecture for robot localization and navigation is presented that performs fusion among odometry, laser range data, and range data from a neural stereoscopic vision system. The estimated robot position is used to navigate safely through the environment. The stereoscopic subsystem delivers dense range information over the entire field of view. This feature allows safer navigation of the robot since, for example, arch-like obstacles may be avoided. Experimental results on localization and obstacle detection are presented. The precision attained by the system allows safe navigation. The use of visual data is of great importance in many operative scenarios. © 2005 IEEE.
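The abstract does not detail the fusion algorithm itself. As a purely illustrative sketch (not the authors' method), the general idea of fusing an odometry-based prediction with a range-derived position fix can be shown with a minimal one-dimensional Kalman-style filter; all function names and numeric values below are hypothetical:

```python
# Minimal 1-D Kalman-style fusion sketch (illustrative only; the paper's
# actual fusion architecture is not specified in this abstract).

def predict(x, p, u, q):
    """Propagate the position estimate x with odometry displacement u.

    The variance p grows by the process noise q, reflecting odometry drift.
    """
    return x + u, p + q

def update(x, p, z, r):
    """Correct the prediction with a range-derived position measurement z.

    r is the measurement variance (e.g. from a laser or stereo range fix).
    """
    k = p / (p + r)                  # Kalman gain: trust in the measurement
    return x + k * (z - x), (1 - k) * p

# Example: odometry reports a 1.0 m move; a range fix reads 0.9 m.
x, p = 0.0, 0.1                      # initial position and variance
x, p = predict(x, p, u=1.0, q=0.05)  # odometry step
x, p = update(x, p, z=0.9, r=0.02)   # laser/stereo correction
```

The fused estimate lands between the odometry prediction and the range measurement, weighted by their variances, and the posterior variance is smaller than either input's, which is the property that makes multi-sensor fusion attractive for safe navigation.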
Original language: English
DOIs: https://doi.org/10.1109/ICAR.2005.1507409
Publication status: Published - 2005
Event: 12th International Conference on Advanced Robotics, 2005. ICAR '05, United States
Duration: 1 Jan 2005 → …

Conference

Conference: 12th International Conference on Advanced Robotics, 2005. ICAR '05
Country: United States
Period: 1/1/05 → …

Fingerprint

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Pagnottelli, S., Taraglio, S., Valigi, P., & Zanela, A. (2005). Visual and laser sensory data fusion for outdoor robot localisation and navigation. Paper presented at 12th International Conference on Advanced Robotics, 2005. ICAR '05, United States. https://doi.org/10.1109/ICAR.2005.1507409