Abstract:
This paper describes a robust localization approach for a moving target based on RGB-depth (RGB-D) camera and 2D light detection and ranging (LiDAR) sensor measurements. In the proposed approach, the 3D and 2D position information of a target, measured by the RGB-D camera and LiDAR sensor respectively, is used to locate the target by incorporating visual tracking algorithms, the depth information of the structured light sensor, and a vision-LiDAR low-level fusion algorithm (e.g., extrinsic calibration). For robust localization, a novel approach is proposed that makes use of Kalman prediction and filtering with intermittent observations, which are identified from depth image segmentation. The proposed depth-aided localization algorithm shows robust tracking results even when visual tracking using the RGB camera fails. The experimental results are compared to position data from VICON motion capture as a ground truth, and the comparison demonstrates the performance superiority and robustness of the proposed approach.
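The core idea of Kalman prediction and filtering with intermittent observations can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a simple 1D constant-velocity model with position-only measurements, where a missing observation (e.g., a visual tracking failure) is represented by `None` and the filter runs prediction only for that step.

```python
import numpy as np

# Minimal 1D constant-velocity Kalman filter with intermittent observations.
# When a measurement is missing (None), only the prediction step runs,
# so the state estimate "coasts" through observation dropouts.

def kalman_intermittent(zs, dt=0.1, q=1e-3, r=1e-2):
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])              # observe position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.zeros((2, 1))                    # initial state estimate
    P = np.eye(2)                           # initial error covariance
    estimates = []
    for z in zs:
        # Prediction step (always performed)
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step (only when an observation is available)
        if z is not None:
            y = np.array([[z]]) - H @ x     # innovation
            S = H @ P @ H.T + R             # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

In the paper's setting, the `None` gaps would correspond to frames where the RGB-based visual tracker loses the target, with the validity of each observation judged from depth image segmentation.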
Published in: The 2014 International Conference on Control, Automation and Information Sciences (ICCAIS 2014)
Date of Conference: 02-05 December 2014
Date Added to IEEE Xplore: 26 January 2015
Electronic ISBN: 978-1-4799-7204-3