
Visual pointing gestures for bi-directional human robot interaction in a pick-and-place task


Abstract:

This paper explores visual pointing gestures for two-way non-verbal communication when interacting with a robot arm. Such non-verbal instruction is common when humans communicate spatial directions and actions while collaboratively performing manipulation tasks. Using 3D RGB-D data, we compare human-human and human-robot interaction in a pick-and-place task. In the human-human interaction we study pointing and other types of gestures performed by humans in a collaborative task. For the human-robot interaction we design a system that allows the user to interact with a 7-DOF robot arm using gestures for selecting, picking and dropping objects at different locations. Bi-directional confirmation gestures allow the robot (or human) to verify that the right object is selected. We perform experiments in which 8 human subjects collaborate with the robot to manipulate ordinary household objects on a tabletop. Without confirmation feedback, selection accuracy was 70-90% for both humans and the robot. With feedback through confirmation gestures, both humans and our vision-based robotic system could perform the task accurately every time (100%). Finally, to illustrate our gesture interface in a real application, we let a human instruct our robot to make a pizza by selecting different ingredients.
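The object-selection step described above can be pictured as casting the pointing ray defined by the user's arm onto the table plane and choosing the nearest object centroid. The Python/NumPy sketch below illustrates that idea only; the keypoint names (elbow, wrist), the table-frame coordinates and the distance threshold are assumptions for illustration, not details taken from the paper's implementation.

import numpy as np

# Minimal sketch of pointing-based object selection on a tabletop.
# Assumes an RGB-D pipeline already provides 3D arm keypoints and object
# centroids in a common table frame; names and thresholds are illustrative.

def intersect_table(origin, direction, table_height=0.0):
    """Intersect the pointing ray with the horizontal table plane z = table_height."""
    direction = direction / np.linalg.norm(direction)
    if abs(direction[2]) < 1e-6:          # ray parallel to the table
        return None
    t = (table_height - origin[2]) / direction[2]
    if t <= 0:                            # table plane is behind the pointing hand
        return None
    return origin + t * direction

def select_object(elbow, wrist, object_centroids, max_dist=0.15):
    """Return the index of the object whose centroid is closest to the ray/table hit."""
    hit = intersect_table(wrist, wrist - elbow)
    if hit is None:
        return None
    dists = [np.linalg.norm(np.asarray(c)[:2] - hit[:2]) for c in object_centroids]
    best = int(np.argmin(dists))
    return best if dists[best] < max_dist else None   # reject ambiguous pointing

# Example: a pointing query over three objects on the table (coordinates in metres).
elbow = np.array([0.0, -0.5, 0.4])
wrist = np.array([0.1, -0.3, 0.3])
objects = [np.array([0.5, 0.2, 0.0]), np.array([0.25, 0.0, 0.0]), np.array([-0.3, 0.4, 0.0])]
print(select_object(elbow, wrist, objects))   # prints 0 (the first object is selected)

A confirmation gesture, as used in the paper, would then close the loop: the robot indicates the candidate object (for instance by hovering over it) and the human accepts or rejects the selection before the pick is executed.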
Date of Conference: 31 August 2015 - 04 September 2015
Date Added to IEEE Xplore: 23 November 2015
Conference Location: Kobe, Japan
