Sensor Fusion for Mobile Robot Navigation

M. Kam, Xiaoxun Zhu, P. Kalata
Proceedings of the IEEE, 1997
We review techniques for sensor fusion in robot navigation, emphasizing algorithms for self-location. These find use when the sensor suite of a mobile robot comprises several different sensors, some complementary and some redundant. Integrating the sensor readings, the robot seeks to accomplish tasks such as constructing a map of its environment, locating itself in that map, and recognizing objects that should be avoided or sought. Our review describes integration techniques in two categories:
Low-level fusion is used for direct integration of sensory data, resulting in parameter and state estimates; high-level fusion is used for indirect integration of sensory data in hierarchical architectures, through command arbitration and integration of control signals suggested by different modules. The review provides an arsenal of tools for addressing this (rather ill-posed) problem in machine intelligence, including Kalman filtering, rule-based techniques, behavior-based algorithms, and approaches that borrow from information theory, Dempster-Shafer reasoning, fuzzy logic, and neural networks. It points to several further research needs, including: robustness of decision rules; simultaneous consideration of self-location, motion planning, motion control, and vehicle dynamics; the effect of sensor placement and attention focusing on sensor fusion; and adaptation of techniques from biological sensor fusion.
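To illustrate the low-level-fusion category, here is a minimal sketch of how a Kalman filter fuses redundant range sensors into a single position estimate. This is not the paper's own formulation; the scalar model, sensor labels, and all numeric values (motion command, noise variances, readings) are illustrative assumptions.

```python
# Scalar Kalman filter: low-level fusion of redundant position sensors.
# The robot's belief is a Gaussian (mean x, variance p); each sensor
# reading z with noise variance r shrinks the variance of the estimate.

def predict(x, p, u, q):
    """Propagate the estimate through a motion command u (process noise q)."""
    return x + u, p + q

def kalman_update(x, p, z, r):
    """Fuse a measurement z (variance r) into the estimate (x, p)."""
    k = p / (p + r)          # Kalman gain: how much to trust the sensor
    x = x + k * (z - x)      # corrected estimate
    p = (1 - k) * p          # fused variance is smaller than either input
    return x, p

# Hypothetical scenario: robot at 0.0 m (variance 1.0) moves 1 m forward,
# then fuses a noisy sonar reading and a more precise laser reading.
x, p = 0.0, 1.0
x, p = predict(x, p, u=1.0, q=0.1)        # odometry prediction
x, p = kalman_update(x, p, z=1.2, r=0.5)  # sonar (redundant sensor 1)
x, p = kalman_update(x, p, z=0.9, r=0.2)  # laser (redundant sensor 2)
print(round(x, 3), round(p, 3))
```

Each redundant reading reduces the posterior variance, which is the sense in which direct integration of sensory data yields improved parameter and state estimates.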
doi:10.1109/JPROC.1997.554212