A human-robot interaction interface for mobile and stationary robots based on real-time 3D human body and hand-finger pose estimation


Abstract:

This paper presents a real-time gesture-based human-robot interaction (HRI) interface for mobile and stationary robots. A human detection approach is used to extract the full 3D point cloud of a person within the field of view of a moving camera. The pose of the human body is then estimated using an efficient self-organizing map approach. Furthermore, a hand-finger pose estimation approach based on a self-scaling kinematic hand skeleton is presented and evaluated. A trained support vector machine classifies 29 hand-finger gestures based on the angles of the finger joints. The HRI interface is integrated into the ROS framework and qualitatively evaluated in a first test scenario on a mobile robot equipped with an RGB-D camera for gesture interaction. Since the hand-finger pose, the hand-finger gestures, and the whole-body pose are all estimated, the interface allows flexible implementation of various applications.
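The abstract describes classifying hand-finger gestures with an SVM operating on finger-joint angles. The following is a minimal sketch of that idea, not the authors' implementation: the number of joint angles, the use of scikit-learn, the RBF kernel, and the placeholder training data are all assumptions; in practice the angles would come from the self-scaling kinematic hand skeleton and labeled training recordings.

```python
# Hypothetical sketch: SVM gesture classification from finger-joint angles.
# Feature dimensionality, kernel choice, and synthetic data are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_JOINT_ANGLES = 20   # assumed number of joint angles from the hand skeleton
N_GESTURES = 29       # number of gesture classes stated in the abstract

# Placeholder training data; real data would be labeled hand-pose recordings.
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, np.pi / 2, size=(500, N_JOINT_ANGLES))
y_train = rng.integers(0, N_GESTURES, size=500)

# Scale the joint angles and train a multiclass RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)

# Classify one frame of estimated joint angles into a gesture class.
joint_angles = rng.uniform(0.0, np.pi / 2, size=(1, N_JOINT_ANGLES))
gesture_id = int(clf.predict(joint_angles)[0])
print(f"Recognized gesture class: {gesture_id}")
```

In an HRI pipeline like the one described, the predicted gesture class would then be published (e.g., as a ROS message) so that mobile or stationary robots can react to it.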
Date of Conference: 06-09 September 2016
Date Added to IEEE Xplore: 07 November 2016
Conference Location: Berlin, Germany
