Title: Multi-Modal Sensor Fusion for Indoor Mobile Robot Pose Estimation
Author(s): Yassen Dobrev, Sergio Flores, and Martin Vossiek
Published in: Proceedings of IEEE/ION PLANS 2016
April 11 - 14, 2016
Hyatt Regency Hotel
Savannah, GA
Pages: 553 - 556
Cite this article: Dobrev, Yassen, Flores, Sergio, Vossiek, Martin, "Multi-Modal Sensor Fusion for Indoor Mobile Robot Pose Estimation," Proceedings of IEEE/ION PLANS 2016, Savannah, GA, April 2016, pp. 553-556.
Abstract: While global navigation satellite systems (GNSS) are the state of the art for outdoor localization, they are generally unable to operate inside buildings, and there is currently no well-established solution for indoor localization. In this paper we propose a 3D mobile robot pose (2D position and 1D orientation) estimation system for indoor applications. The system is based on the cooperative sensor fusion of radar, ultrasonic, and odometry data using an extended Kalman filter (EKF). Prerequisites for the EKF are an occupancy grid map of the scenario and the pose of the reference radar node inside that map. Because the radar provides absolute localization, the system can recover even from the kidnapped-robot case. We conducted a series of measurements in an office building corridor and determined the typical position root-mean-square error (RMSE) to be less than 15 cm.
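The fusion scheme the abstract describes — odometry driving the EKF prediction step and absolute radar measurements driving the update — can be sketched as follows. This is a minimal illustration only: the unicycle motion model, the scalar range measurement to the reference node, and all noise values are assumptions for the sketch, not the parameters or measurement model from the paper.

```python
import numpy as np

# Assumed known pose of the fixed reference radar node (illustrative).
RADAR_NODE = np.array([5.0, 2.0])

def predict(x, P, v, w, dt, Q):
    """Propagate pose [x, y, theta] with a unicycle odometry model."""
    theta = x[2]
    x_new = x + np.array([v * np.cos(theta) * dt,
                          v * np.sin(theta) * dt,
                          w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def update_range(x, P, z, R):
    """Fuse one absolute range measurement to the fixed radar node."""
    dx, dy = RADAR_NODE[0] - x[0], RADAR_NODE[1] - x[1]
    r = np.hypot(dx, dy)                      # predicted range
    H = np.array([[-dx / r, -dy / r, 0.0]])   # d(range)/d(state)
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x + (K @ np.array([z - r])).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# One predict/update cycle with illustrative noise settings
x = np.array([0.0, 0.0, 0.0])
P = np.eye(3) * 0.1
Q = np.eye(3) * 1e-3   # process noise (assumed)
R = np.array([[0.05]]) # range measurement noise (assumed)

x, P = predict(x, P, v=0.5, w=0.1, dt=0.1, Q=Q)
measured_range = np.hypot(*(RADAR_NODE - x[:2])) + 0.02
x, P = update_range(x, P, z=measured_range, R=R)
```

Because the radar update is tied to an absolute reference, the same update step also explains why the filter can re-converge after a kidnapped-robot event: an arbitrarily wrong prior is corrected by subsequent absolute measurements rather than by dead reckoning alone.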