
Multi-modal sensor fusion for indoor mobile robot pose estimation



Abstract:

While global navigation satellite systems (GNSS) are the state of the art for localization, they are generally unable to operate inside buildings, and there is currently no well-established solution for indoor localization. In this paper, we propose a 3D mobile robot pose (2D position and 1D orientation) estimation system for indoor applications. The system is based on the cooperative sensor fusion of radar, ultrasonic, and odometry data using an extended Kalman filter (EKF). A prerequisite for the EKF is an occupancy grid map of the scenario as well as the pose of the reference radar node inside the map. Our system can handle even the kidnapped-robot case, as the radar provides absolute localization. We conducted a series of measurements in an office building corridor and determined the typical position root-mean-square error (RMSE) to be less than 15 cm.
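
The EKF at the heart of such a system alternates an odometry-driven prediction step with a correction from an absolute measurement, such as a radar-derived position fix. The sketch below illustrates this structure for a 2D position plus heading state in Python/NumPy; the motion model, measurement model, and noise parameters are illustrative assumptions and not the authors' implementation.

```python
# Minimal EKF sketch for a pose state [x, y, theta], assuming a unicycle
# odometry model and an absolute (x, y) fix (e.g. from radar). All noise
# values and models below are illustrative assumptions.
import numpy as np

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def ekf_predict(x, P, v, omega, dt, Q):
    """Propagate the pose with odometry (v: linear, omega: angular velocity)."""
    theta = x[2]
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           omega * dt])
    x_pred[2] = wrap_angle(x_pred[2])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update_position(x, P, z, R):
    """Correct the pose with an absolute (x, y) measurement, e.g. a radar fix."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    x_new[2] = wrap_angle(x_new[2])
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

if __name__ == "__main__":
    x = np.zeros(3)                    # initial pose [x, y, theta]
    P = np.diag([1.0, 1.0, 0.1])       # initial uncertainty
    Q = np.diag([0.01, 0.01, 0.005])   # process noise (assumed)
    R = np.diag([0.05, 0.05])          # radar position noise (assumed)
    x, P = ekf_predict(x, P, v=0.5, omega=0.1, dt=0.1, Q=Q)
    x, P = ekf_update_position(x, P, z=np.array([0.06, 0.01]), R=R)
    print("pose estimate:", x)
```

Because the radar fix is absolute, a large innovation simply pulls the estimate back toward the true pose, which is what allows recovery from the kidnapped-robot situation described in the abstract.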
Date of Conference: 11-14 April 2016
Date Added to IEEE Xplore: 30 May 2016
Electronic ISBN: 978-1-5090-2042-3
Electronic ISSN: 2153-3598
Conference Location: Savannah, GA, USA