Optimizing RGB-D Fusion for Accurate 6DoF Pose Estimation


Abstract:

Today's standard object localization systems often do not meet industry demands for 2D and 3D accuracy in digital manufacturing applications. Two use cases are considered: digital-based assistance and robotic inspection. 2D precision is necessary to provide accurate assistance, while 3D precision is crucial for an inspection that is as close as possible to the object's true state. In this letter, we propose a new pose estimation system that ensures the highest precision in both 2D and 3D. While most RGB-based solutions focus on obtaining the best 2D accuracy, RGB-D-based systems mainly use depth information to maximize 3D accuracy. Very few solutions jointly optimize both constraints. Nonetheless, pose estimation should be highly accurate in both respects, since a slight 2D error can result in a large 3D error (and vice versa). To address this problem, we present a new system that uses RGB-D data to take full advantage of the depth information. A new 3D primitive is proposed to minimize the effect of RGB-D noise on the accuracy of 3D coordinates. A CNN Keypoint Detector (KPD) is used to localize this new primitive and perform the pose estimation task. Finally, we propose a novel refinement method that ensures optimal precision by fusing both RGB and depth information. We report results on the widely used and challenging Linemod and Occlusion datasets, and demonstrate that our solution outperforms state-of-the-art methods when both 3D and 2D accuracy are taken into account.
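The coupling the abstract relies on, that a slight 2D error can result in a large 3D error and vice versa, can be illustrated with a minimal pinhole back-projection sketch. The intrinsics below are made-up values for a typical RGB-D sensor; this is not the paper's primitive or refinement method, only a demonstration of how pixel and depth noise propagate to 3D coordinates:

```python
import numpy as np

# Hypothetical camera intrinsics (illustrative only, not from the paper).
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0

def backproject(u, v, z):
    """Back-project pixel (u, v) with depth z (metres) into the camera frame."""
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

# A 1-pixel 2D localization error at 1 m depth shifts the 3D point laterally
# by z / fx, roughly 1.7 mm here.
p_true = backproject(400.0, 240.0, 1.0)
p_px   = backproject(401.0, 240.0, 1.0)
err_from_2d = np.linalg.norm(p_px - p_true)

# A 5 mm depth error moves the point both along and across the viewing ray,
# so raw depth noise translates almost one-to-one into 3D error.
p_z = backproject(400.0, 240.0, 1.005)
err_from_depth = np.linalg.norm(p_z - p_true)
```

At larger working distances the lateral term grows with z, which is why optimizing 2D keypoint accuracy alone does not bound the 3D error, and why the letter argues for fusing both cues.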
Published in: IEEE Robotics and Automation Letters ( Volume: 6, Issue: 2, April 2021)
Page(s): 2413 - 2420
Date of Publication: 23 February 2021
