Abstract:
This letter presents a novel method for visual odometry estimation from an RGB-D camera. The camera motion is estimated by aligning a source to a target RGB-D frame using an intensity-assisted iterative closest point (ICP) algorithm. The proposed method differs from the conventional ICP in the following aspects. 1) To reduce the computational cost, salient point selection is performed on the source frame, so that only points containing valuable information for registration are used. 2) To reduce the influence of outliers and noise, a robust weighting function is proposed to weight corresponding pairs based on the statistics of their spatial distances and intensity differences. 3) The robust weighting function obtained in 2) is then used for correspondence estimation in the following ICP iteration. The proposed method runs in real time on a single CPU thread and is therefore suitable for robots with limited computational resources. The evaluation on the TUM RGB-D benchmark shows that, in the majority of the tested sequences, our proposed method outperforms state-of-the-art methods in terms of translational drift per second, at a computation speed of 78 Hz.
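To illustrate item 2), the sketch below shows one possible way to derive per-correspondence weights from the statistics of spatial distances and intensity differences. The specific robust function (Tukey biweight), the MAD-based scale estimate, and the constant c are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def robust_weights(spatial_dist, intensity_diff, c=4.6851):
    """Per-correspondence weights from spatial and intensity residuals.

    Each residual set is scaled by a robust spread estimate (median
    absolute deviation) and passed through a Tukey biweight function;
    the two weights are combined multiplicatively. This is a sketch of
    the idea, not the paper's exact weighting function.
    """
    def tukey(r):
        # Robust sigma estimate: 1.4826 * MAD (small epsilon avoids division by zero).
        sigma = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        u = r / (c * sigma)
        w = (1.0 - u**2) ** 2
        w[np.abs(u) >= 1.0] = 0.0  # residuals beyond the cutoff get zero weight
        return w

    # Combine geometric and photometric evidence.
    return tukey(spatial_dist) * tukey(intensity_diff)

# Usage with five hypothetical correspondences: the fourth pair has a large
# spatial and intensity residual and is strongly down-weighted.
d = np.array([0.002, 0.003, 0.0025, 0.050, 0.0028])  # point-to-point distances (m)
i = np.array([2.0, 3.0, 2.5, 40.0, 1.5])             # intensity differences
print(robust_weights(d, i))
```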
Published in: IEEE Robotics and Automation Letters (Volume: 1, Issue: 2, July 2016)