Real-Time Collision-Free Grasp Pose Detection With Geometry-Aware Refinement Using High-Resolution Volume


Abstract:

In this letter, we propose a novel vision-based grasp system for closed-loop 6-degree-of-freedom grasping of unknown objects in cluttered environments. The key factor in our system is that we make the most of a geometry-aware scene representation based on a truncated signed distance function (TSDF) volume, which can handle multi-view observations from the vision sensor, provide comprehensive spatial information for the grasp pose detector, and allow collision checking to achieve a collision-free grasp pose. To eliminate the large computational burden caused by the volume-based data, a lightweight volume-point network (VPN) equipped with the Marching Cubes algorithm is proposed. This network predicts the point-wise grasp qualities for all the candidates in a single feed-forward operation with real-time performance, enabling the system to perform closed-loop grasping. Furthermore, a grasp pose refinement module is integrated to predict the pose residual based on the SDF observation of the gripper state in the TSDF volume. Extensive experiments show that the proposed method can achieve collision-free grasp detection with an efficiency of more than 30 Hz on a high-resolution volume. Furthermore, the model trained on only synthetic data achieves a 90.9% grasp success rate on cluttered real-world scenes, significantly outperforming other baseline methods. The supplementary materials are available at https://sites.google.com/view/vpn-icra2022
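
The abstract outlines a pipeline of TSDF fusion, surface extraction with Marching Cubes, point-wise grasp-quality prediction, and SDF-based collision checking. The sketch below only illustrates those building blocks under simplified assumptions; it is not the authors' VPN, and names such as predict_grasp_quality and gripper_points are hypothetical placeholders.

```python
# Minimal sketch of TSDF-based candidate extraction and SDF collision checking,
# assuming a fused TSDF stored as a dense NumPy array with positive values in
# free space. The grasp-quality predictor is a hypothetical stand-in, not the VPN.
import numpy as np
from skimage.measure import marching_cubes
from scipy.ndimage import map_coordinates

def extract_candidates(tsdf, voxel_size):
    """Extract surface points (grasp candidates) from the TSDF zero level set."""
    verts, _, normals, _ = marching_cubes(tsdf, level=0.0)
    return verts * voxel_size, normals        # candidate positions in metres

def sdf_at(tsdf, voxel_size, points):
    """Trilinearly interpolate the signed distance at world-space points."""
    coords = (points / voxel_size).T          # (3, N) voxel coordinates
    return map_coordinates(tsdf, coords, order=1, mode='nearest')

def collision_free(tsdf, voxel_size, gripper_points, margin=0.0):
    """Keep a grasp only if every sampled gripper point stays in free space,
    i.e. its signed distance exceeds the safety margin."""
    return np.all(sdf_at(tsdf, voxel_size, gripper_points) > margin)

def predict_grasp_quality(points, normals):
    """Hypothetical placeholder for the volume-point network, which scores all
    candidates in a single feed-forward pass; here a dummy uniform score."""
    return np.full(len(points), 0.5)
```

In this sketch, candidates would be extracted once per fused volume, scored in a batch, and each high-scoring pose verified with collision_free before execution, mirroring the closed-loop detect-check-refine flow described in the abstract.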
Published in: IEEE Robotics and Automation Letters ( Volume: 7, Issue: 2, April 2022)
Page(s): 1888 - 1895
Date of Publication: 13 January 2022
