
Visual localization using Bayesian decision fusion on omnidirectional sensing

Paletta, L.; Frintrop, S.; Hertzberg, J.

Dasarathy, B.V. ; Society of Photo-Optical Instrumentation Engineers -SPIE-, Bellingham/Wash.:
Sensor fusion: Architectures, algorithms, and applications V : 18 - 20 April 2001, Orlando, USA
Bellingham/Wash.: SPIE, 2001 (SPIE Proceedings Series 4385)
ISBN: 0-8194-4080-9
Technical Conference "Sensor Fusion: Architectures, Algorithms, and Applications" <2001, Orlando/Fla.>
Conference Paper
Fraunhofer AIS (IAIS)

Omnidirectional visual sensors have recently been introduced successfully to robot navigation, providing improved localization performance and more stable path-following behavior. Because of the sensor's characteristics, occlusion of the entire panoramic visual field is very unlikely. The presented work exploits this property, providing a Bayesian framework that gains even partial evidence about the current location by applying decision fusion to the multi-directional visual context. The panoramic image is first partitioned into a fixed number of overlapping unidirectional camera views, i.e., appearance sectors. For each sector image, a posterior distribution over potential locations within a predefined environment is learned. Ambiguity in a local sector interpretation is then resolved by Bayesian reasoning over the spatial context of the current position, discriminating occlusions that do not fit the appearance model of subsequent sector views. Results from navigation experiments in an office with a robot equipped with an omnidirectional camera demonstrate that the Bayesian reasoning allows highly occlusion-tolerant localization, enabling visual navigation of autonomous robots even in crowded places such as offices, factories, and urban environments.
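The fusion step described in the abstract can be sketched roughly as follows. This is a minimal illustrative implementation, not the paper's exact formulation: the function name, the entropy-based occlusion test, and the naive-Bayes combination of per-sector posteriors are all assumptions introduced here for clarity.

```python
import numpy as np

def fuse_sector_posteriors(sector_posteriors, prior=None, occlusion_threshold=0.9):
    """Fuse per-sector posteriors P(location | sector image) into a single
    posterior over locations, skipping sectors whose distribution is
    near-uniform (used here as a crude indicator of occlusion).

    Hypothetical sketch; the paper's actual spatial-context reasoning
    is richer than this independence-style fusion."""
    sector_posteriors = np.asarray(sector_posteriors, dtype=float)
    n_sectors, n_locations = sector_posteriors.shape
    if prior is None:
        prior = np.full(n_locations, 1.0 / n_locations)  # uniform prior

    log_fused = np.log(prior)
    max_entropy = np.log(n_locations)
    for p in sector_posteriors:
        entropy = -np.sum(p * np.log(p + 1e-12))
        if entropy / max_entropy > occlusion_threshold:
            continue  # near-uniform evidence: likely occluded, discard sector
        # Naive-Bayes style fusion: accumulate evidence, dividing out the
        # prior so it is counted only once overall.
        log_fused += np.log(p + 1e-12) - np.log(prior)

    fused = np.exp(log_fused - log_fused.max())  # stabilize before normalizing
    return fused / fused.sum()

# Example: three sectors observing four candidate locations; the second
# sector is nearly uniform (treated as occluded) and is discarded.
posteriors = [
    [0.70, 0.10, 0.10, 0.10],
    [0.26, 0.25, 0.25, 0.24],
    [0.60, 0.20, 0.10, 0.10],
]
fused = fuse_sector_posteriors(posteriors)
print(fused.argmax())  # location 0 dominates after fusion
```

Discarding near-uniform sectors mirrors the abstract's point that an occluded sector contributes no discriminative evidence, while the remaining sectors still yield a confident location estimate.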