DepthGrasp: Depth Completion of Transparent Objects Using Self-Attentive Adversarial Network with Spectral Residual for Grasping


Abstract:

Transparent objects with unique visual properties often make depth cameras fail to scan their reflective and refractive surfaces. Recent studies on depth completion of transparent objects leverage a linear system based on geometric constraints to predict the missing depth, which is hard to employ in an end-to-end framework or to optimize jointly. In this paper, we propose DepthGrasp, a deep learning approach for depth completion of transparent objects from a raw RGB-D image. More specifically, we use a generative adversarial network in which the generator completes the depth maps by predicting the missing or inaccurate depth values, and the discriminator guides the completed depth maps toward the ground truth. In the generator, we devise spectral residual blocks (SRBs) with spectral normalization for network stability, and each residual block passes an attention map to capture structural information and distinguish the geometric shapes of transparent objects. In the discriminator, we use a patch-based convolutional network to align the distribution of the predicted depth maps with the ground truth. Extensive experiments conducted on the ClearGrasp dataset show the effectiveness and generalization of DepthGrasp for depth completion, and the deployed robotic picking system significantly improves grasping performance on transparent objects.
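A minimal sketch of the two architectural ideas the abstract names: a spectral residual block (SRB) whose convolutions are wrapped in spectral normalization for training stability, and a patch-based discriminator that scores local patches of the completed depth map. PyTorch, the layer widths, and the exact block layout are assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class SpectralResidualBlock(nn.Module):
    """Residual block with spectrally normalized convolutions (illustrative SRB)."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            spectral_norm(nn.Conv2d(channels, channels, 3, padding=1)),
            nn.ReLU(inplace=True),
            spectral_norm(nn.Conv2d(channels, channels, 3, padding=1)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity skip connection keeps gradients and structural detail flowing.
        return x + self.body(x)


class PatchDiscriminator(nn.Module):
    """PatchGAN-style discriminator: outputs a grid of per-patch real/fake scores."""

    def __init__(self, in_channels: int = 4):  # e.g. RGB + completed depth
        super().__init__()

        def block(c_in, c_out, stride):
            return nn.Sequential(
                spectral_norm(nn.Conv2d(c_in, c_out, 4, stride=stride, padding=1)),
                nn.LeakyReLU(0.2, inplace=True),
            )

        self.net = nn.Sequential(
            block(in_channels, 64, 2),
            block(64, 128, 2),
            block(128, 256, 2),
            nn.Conv2d(256, 1, 4, padding=1),  # per-patch score map
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


if __name__ == "__main__":
    rgbd = torch.randn(1, 4, 256, 256)    # toy RGB-D input
    feats = torch.randn(1, 64, 64, 64)    # toy generator feature map
    print(SpectralResidualBlock(64)(feats).shape)  # torch.Size([1, 64, 64, 64])
    print(PatchDiscriminator(4)(rgbd).shape)       # grid of patch scores
```

In an adversarial setup like the one described, the generator's completed depth map (optionally concatenated with the RGB input) would be fed to the patch discriminator, whose per-patch scores encourage locally realistic depth structure rather than a single global real/fake decision.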
Date of Conference: 27 September 2021 - 01 October 2021
Date Added to IEEE Xplore: 16 December 2021
Conference Location: Prague, Czech Republic
