ABSTRACT
Most human interaction with the environment depends on our ability to navigate freely and to use our hands and arms to manipulate objects. Developing natural means of controlling these abilities in humanoid robots can significantly broaden the usability of such platforms. An ideal interface for humanoid robot teleoperation would be inexpensive and person-independent, require no wearable equipment, and be easy to use, demanding little or no user training.
This work presents a new humanoid robot control and interaction interface that uses depth images and skeletal tracking software to control the navigation, gaze, and arm gestures of a humanoid robot. To control the robot, the user stands in front of a depth camera and assumes a specific pose to initiate skeletal tracking. The user's initial location automatically becomes the origin of the control coordinate system. The user can then use leg and arm gestures to turn the robot's motors on and off, to switch operation modes, and to control the behavior of the robot. We present two control modes. The body control mode enables the user to control the robot's arms and navigation direction with his or her own arms and location, respectively. The gaze direction control mode enables the user to control the robot's focus of attention by pointing with one hand while giving commands through gestures of the other hand. We demonstrate this interface by combining the two control modes to enable an Aldebaran Nao robot to carry an object from one location to another. Our implementation uses the Microsoft Kinect depth sensor.
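The abstract does not give the joint-angle mapping used by the body control mode, so as a hedged illustration only, one common way to map tracked shoulder and elbow positions to two robot shoulder angles is to take the upper-arm direction vector and decompose it into pitch and roll. The function name, coordinate convention (x right, y up, z toward the sensor), and formulas below are assumptions for illustration, not the paper's published method:

```python
import math

def arm_angles(shoulder, elbow):
    """Map 3-D shoulder/elbow joint positions (camera frame:
    x right, y up, z toward the sensor) to a shoulder pitch and
    roll angle, in radians. Illustrative mapping only."""
    # Upper-arm direction vector from shoulder to elbow.
    vx = elbow[0] - shoulder[0]
    vy = elbow[1] - shoulder[1]
    vz = elbow[2] - shoulder[2]
    # Pitch: rotation of the upper arm in the sagittal (y-z) plane;
    # an arm hanging straight down gives pi/2.
    pitch = math.atan2(-vy, -vz)
    # Roll: lateral lift of the arm away from the torso.
    roll = math.asin(vx / math.sqrt(vx * vx + vy * vy + vz * vz))
    return pitch, roll

# Arm hanging straight down: pitch pi/2, no lateral lift.
print(arm_angles((0.0, 0.0, 0.0), (0.0, -1.0, 0.0)))
```

The resulting angle pair would then be sent to the corresponding shoulder joints of the robot at each skeleton-tracking update, typically after smoothing and clamping to the robot's joint limits.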
Index Terms
- Humanoid robot control using depth camera