
Living Object Grasping Using Two-Stage Graph Reinforcement Learning



Abstract:

Living objects are hard to grasp because they can actively dodge and struggle, writhing or deforming while being contacted or even before contact, and modeling or predicting their responses to grasping is extremely difficult. This letter presents an algorithm based on reinforcement learning (RL) to tackle this challenging problem. Given the complexity of living object grasping, we divide the whole task into a pre-grasp stage and an in-hand stage and let the algorithm switch between the two automatically. The pre-grasp stage aims to find a good pose for the robot hand to approach a living object and perform a grasp. Dense reward functions based on the poses of both the hand and the object are proposed to facilitate learning of the correct hand actions. Since an object held in the hand may struggle to escape, the robot hand needs to adjust its configuration and respond correctly to the object's movement. Hence, the goal of the in-hand stage is to determine an appropriate adjustment of the finger configuration so that the robot hand keeps holding the object. At this stage, we treat the robot hand as a graph and use a graph convolutional network (GCN) to determine the hand action. We test our algorithm in both simulation and real experiments, which show its good performance in living object grasping. More results are available on our website: https://sites.google.com/view/graph-rl.
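
To make the two-stage idea concrete, below is a minimal, hypothetical sketch (not the authors' released implementation): the robot hand is treated as a graph whose nodes are finger joints, a small graph convolutional policy maps per-joint features to per-joint configuration adjustments for the in-hand stage, and a simple contact-based switch falls back to a pre-grasp controller before contact. All class names, feature dimensions, and the switching rule are illustrative assumptions.

import numpy as np

def normalized_adjacency(edges, n_nodes):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    A = np.eye(n_nodes)
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

class GCNPolicy:
    """Two-layer GCN mapping per-joint features to per-joint actions (hypothetical)."""
    def __init__(self, in_dim, hidden_dim, act_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.1, size=(in_dim, hidden_dim))
        self.W2 = rng.normal(scale=0.1, size=(hidden_dim, act_dim))

    def forward(self, A_hat, X):
        H = np.maximum(A_hat @ X @ self.W1, 0.0)   # graph convolution + ReLU
        return np.tanh(A_hat @ H @ self.W2)        # bounded per-joint adjustments

# Toy hand graph: 5 fingers x 3 joints = 15 nodes, joints chained within each finger.
edges = [(f * 3 + k, f * 3 + k + 1) for f in range(5) for k in range(2)]
A_hat = normalized_adjacency(edges, n_nodes=15)
policy = GCNPolicy(in_dim=4, hidden_dim=32, act_dim=1)

def pre_grasp_action(hand_obs):
    # Stage-1 placeholder: in the letter this is a learned policy that moves the
    # hand toward a good approach pose under dense pose-based rewards.
    return np.zeros(6)

def grasp_step(hand_obs, in_contact):
    """Switch between the pre-grasp and in-hand stages, as described in the abstract."""
    if not in_contact:
        return pre_grasp_action(hand_obs)          # stage 1: approach the object
    X = hand_obs["joint_features"]                 # (15, 4): e.g. angle, velocity, contact, torque
    return policy.forward(A_hat, X)                # stage 2: (15, 1) per-joint adjustment

A toy call such as grasp_step({"joint_features": np.zeros((15, 4))}, in_contact=True) returns a (15, 1) array of per-joint adjustments; in the actual system both stages are learned with RL and the stage switch is handled automatically, as described in the letter.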
Published in: IEEE Robotics and Automation Letters (Volume: 6, Issue: 2, April 2021)
Page(s): 1950 - 1957
Date of Publication: 19 February 2021

