Abstract:
This paper presents a real-time gesture-based human-robot interaction (HRI) interface for mobile and stationary robots. A human detection approach is used to estimate the entire 3D point cloud of a human being inside the field of view of a moving camera. Afterwards, the pose of the human body is estimated using an efficient self-organizing map approach. Furthermore, a hand-finger pose estimation approach based on a self-scaling kinematic hand skeleton is presented and evaluated. A trained support vector machine classifies 29 hand-finger gestures based on the angles of the finger joints. The HRI interface is integrated into the ROS framework and qualitatively evaluated in a first test scenario on a mobile robot equipped with an RGB-D camera for gesture interaction. Since the hand-finger pose, the hand-finger gesture, and the whole-body pose are all estimated, the interface allows a flexible implementation of various applications.
Published in: 2016 IEEE 21st International Conference on Emerging Technologies and Factory Automation (ETFA)
Date of Conference: 06-09 September 2016
Date Added to IEEE Xplore: 07 November 2016
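The abstract states that a trained support vector machine classifies hand-finger gestures from the finger-joint angles. The paper's actual feature layout, kernel, and training procedure are not given here, so the following is only a minimal sketch under assumptions: a linear SVM trained by hinge-loss subgradient descent separating two toy gesture classes (out of the paper's 29) described by per-joint flexion angles.

```python
# Minimal sketch (assumption): linear SVM via hinge-loss subgradient descent,
# classifying two toy hand gestures from finger-joint angles. The feature
# layout, gesture classes, data, and training scheme are all illustrative,
# not taken from the paper.
import random

N_JOINTS = 15  # hypothetical: one flexion angle per modeled finger joint


def make_sample(mean, spread, rng):
    """Draw one synthetic joint-angle feature vector."""
    return [rng.gauss(mean, spread) for _ in range(N_JOINTS)]


def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Subgradient descent on the regularized hinge loss; labels in {-1, +1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # point inside the margin: hinge term is active
                w = [wj - lr * (lam * wj - yi * xj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # correctly classified with margin: only regularize
                w = [wj - lr * lam * wj for wj in w]
    return w, b


def predict(w, b, x):
    """Sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1


rng = random.Random(0)
# Toy gestures: "open hand" (low flexion, label -1) vs "fist" (high flexion, +1).
X = [make_sample(0.1, 0.05, rng) for _ in range(20)] + \
    [make_sample(1.4, 0.05, rng) for _ in range(20)]
y = [-1] * 20 + [1] * 20

w, b = train_linear_svm(X, y)
print(predict(w, b, [0.1] * N_JOINTS), predict(w, b, [1.4] * N_JOINTS))
```

A multi-class setup such as the paper's 29 gestures would typically wrap a binary SVM in a one-vs-rest or one-vs-one scheme; the paper does not say which, so none is shown here.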