
AViTa: Adaptive Visual-Tactile Dexterous Grasping



Abstract:

Human dexterity facilitates the quick identification of optimal grasp configurations and the precise exertion of force, regardless of the novelty of the objects encountered. Although previous research has primarily focused on the use of visual cues, this work introduces an approach to dexterous grasping that integrates both visual and tactile sensory data. In our methodology, we initially generate a comprehensive set of two-finger grasp poses from the partial-view point cloud of the scene, which are then transformed into dexterous grasp proposals. Subsequently, we adopt a methodical trial-and-error policy for object grasping, collecting crucial force feedback directly from the object's surface. Utilizing this feedback, our approach applies a reinforcement learning algorithm to progressively refine the grasp pose. Notably, this exploration process typically requires a minimal number of iterations (comparable to human infant performance, ranging from 1 to 5 attempts) to adjust the grasp to the unique contours of the object's surface. The entire reinforcement learning process necessitates only a few hundred iterations for training. Our experimental findings reveal a high success rate of 88.33% across a diverse set of 25 everyday objects and 5 adversarial objects. Moreover, we observe a significant enhancement in both grasp stability and dexterity across various object types, thereby affirming the comprehensive effectiveness of our methodology.
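
The abstract describes a grasp-then-refine loop: propose a dexterous grasp from a partial point cloud, attempt it, read tactile force feedback, and let a learned policy adjust the pose over at most a handful of attempts. The following is a minimal, hypothetical Python sketch of that loop only; every name (propose_dexterous_grasp, read_tactile_forces, policy_adjustment), the 10-dimensional pose layout, and the force threshold are assumptions for illustration, not the authors' implementation or API.

```python
# Hypothetical sketch of the trial-and-error refinement loop described in the
# abstract. Assumes a grasp pose is a 6-DoF wrist pose plus 4 finger joint
# angles, and tactile feedback is a per-finger normal-force vector.

import numpy as np

MAX_ATTEMPTS = 5               # the paper reports 1-5 attempts typically suffice
FORCE_STABLE_THRESHOLD = 2.0   # assumed: minimum total contact force (N) for stability


def propose_dexterous_grasp(point_cloud: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in: map a partial-view point cloud to an initial
    dexterous grasp pose (3 position + 3 orientation + 4 finger joints)."""
    centroid = point_cloud.mean(axis=0)
    return np.concatenate([centroid, np.zeros(3), np.full(4, 0.3)])


def read_tactile_forces(pose: np.ndarray) -> np.ndarray:
    """Hypothetical sensor stub: per-finger normal forces after closing."""
    rng = np.random.default_rng(int(abs(pose.sum()) * 1e3) % 2**32)
    return rng.uniform(0.0, 1.5, size=4)


def policy_adjustment(forces: np.ndarray) -> np.ndarray:
    """Stand-in for the learned RL policy: translate slightly toward better
    contact and tighten fingers in proportion to their force deficit."""
    deficit = np.clip(FORCE_STABLE_THRESHOLD / 4 - forces, 0.0, None)
    wrist_delta = np.zeros(6)
    wrist_delta[:3] = 0.005 * deficit.mean()  # small translational correction
    finger_delta = 0.1 * deficit              # per-finger joint tightening
    return np.concatenate([wrist_delta, finger_delta])


def refine_grasp(point_cloud: np.ndarray) -> tuple[np.ndarray, bool]:
    """Trial-and-error loop: grasp, read forces, refine, up to MAX_ATTEMPTS."""
    pose = propose_dexterous_grasp(point_cloud)
    for attempt in range(1, MAX_ATTEMPTS + 1):
        forces = read_tactile_forces(pose)
        if forces.sum() >= FORCE_STABLE_THRESHOLD:
            print(f"stable grasp after {attempt} attempt(s)")
            return pose, True
        pose = pose + policy_adjustment(forces)
    return pose, False


if __name__ == "__main__":
    cloud = np.random.default_rng(0).normal(size=(2048, 3))  # fake partial scan
    final_pose, ok = refine_grasp(cloud)
    print("success:", ok)
```

The key design point the abstract emphasizes is that the policy operates on force feedback rather than re-perceiving the scene, which is why a small fixed attempt budget (here MAX_ATTEMPTS = 5) mirrors the reported 1-to-5-attempt behavior.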
Published in: IEEE Robotics and Automation Letters (Volume: 9, Issue: 11, November 2024)
Page(s): 9462 - 9469
Date of Publication: 13 September 2024
