Abstract:
This paper presents a fast RGB-D dense visual odometry method that estimates 12-DoF state information, including the 3D motion and the 6-DoF spatial velocity of a camera-strapdown system. To reduce the computational load, we extract informative pixels through zero-crossing detection on a difference of Gaussians (DoG) response and non-maximum gradient pixel extraction. For the extracted regions, the 3D motion is estimated with the inverse compositional algorithm, and the motion estimate is then used to compute the 6-DoF spatial velocity of the camera. Additionally, we reduce noise in the raw velocity using a Kalman filter. Finally, we validate the proposed algorithm on the TUM RGB-D datasets and report the evaluation results. Our algorithm not only achieves performance comparable to the popular dense visual odometry method, DVO, but also runs up to 2 times faster than DVO.
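The abstract mentions selecting informative pixels via DoG zero-crossings combined with non-maximum gradient pixel extraction. Below is a minimal sketch of one possible interpretation of that selection step, assuming OpenCV and NumPy; the function name, kernel sigmas, and thresholds are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of informative-pixel selection: keep pixels near a
# DoG zero-crossing whose gradient magnitude is also a local maximum.
# Parameters (sigma1, sigma2, grad_thresh) are assumed values, not from the paper.
import cv2
import numpy as np

def select_informative_pixels(gray, sigma1=1.0, sigma2=2.0, grad_thresh=20.0):
    gray = gray.astype(np.float32)

    # Difference of Gaussians (DoG) band-pass response.
    dog = (cv2.GaussianBlur(gray, (0, 0), sigma1)
           - cv2.GaussianBlur(gray, (0, 0), sigma2))

    # Zero-crossing mask: the DoG changes sign within a 3x3 neighbourhood.
    kernel = np.ones((3, 3), np.uint8)
    local_min = cv2.erode(dog, kernel)
    local_max = cv2.dilate(dog, kernel)
    zero_cross = (local_min < 0) & (local_max > 0)

    # Gradient magnitude and a coarse non-maximum check over a 3x3 window.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    mag = np.hypot(gx, gy)
    is_local_max = mag >= cv2.dilate(mag, kernel) - 1e-6

    return zero_cross & is_local_max & (mag > grad_thresh)
```

The returned boolean mask could then restrict the residuals used by the inverse compositional motion estimation to the selected pixels only, which is the source of the reported speed-up.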
Published in: 2017 14th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)
Date of Conference: 28 June 2017 - 01 July 2017
Date Added to IEEE Xplore: 27 July 2017