An architecture for robot localization and navigation is presented that fuses odometry, laser range data, and range estimates from a neural stereoscopic vision system. The estimated robot position is used to navigate safely through the environment. The stereoscopic subsystem delivers dense range information over the entire field of view; this allows safer navigation, since, for example, arch-like obstacles can be detected and avoided. Experimental results are presented for localization and obstacle detection. The precision attained by the system allows safe navigation, and the use of visual data is of great importance in many operative scenarios. © 2005 IEEE.
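The abstract does not specify the fusion algorithm used. A common building block for combining independent range estimates from sensors of different accuracy (e.g., a laser scanner and a stereo vision system) is inverse-variance weighting, which yields the minimum-variance linear combination. The sketch below is illustrative only; the sensor values and variances are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_ranges(measurements, variances):
    """Fuse independent range measurements of the same target by
    inverse-variance weighting; returns the fused estimate and its
    variance. This is a generic technique, not the paper's method."""
    m = np.asarray(measurements, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)  # weights = inverse variances
    fused = np.sum(w * m) / np.sum(w)
    fused_var = 1.0 / np.sum(w)
    return fused, fused_var

# Hypothetical readings: the laser is assumed more precise than stereo range.
laser_range, laser_var = 4.02, 0.01    # metres, metres^2
stereo_range, stereo_var = 4.30, 0.09

r, v = fuse_ranges([laser_range, stereo_range], [laser_var, stereo_var])
```

Note how the fused estimate is pulled toward the more precise (lower-variance) sensor, and the fused variance is smaller than either input variance, which is the benefit of combining complementary sensors.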
Publication status: Published - 2005
Event: 12th International Conference on Advanced Robotics, 2005. ICAR '05, United States
Duration: 1 Jan 2005 → …
Pagnottelli, S., Taraglio, S., Valigi, P., & Zanela, A. (2005). Visual and laser sensory data fusion for outdoor robot localisation and navigation. Paper presented at 12th International Conference on Advanced Robotics, 2005. ICAR '05, United States. https://doi.org/10.1109/ICAR.2005.1507409