Abstract
To make human-robot interaction easier and more diverse, this paper constructs a system in which human users control a robot through body language. First, human body data are collected with a 3D camera, and skeleton features are extracted. Based on the human posture and the semaphore system, an international communication method, the robot recognizes a character of the alphabet. Finally, the robot combines the separate characters into an understandable message and executes what the user wants it to do. In simulation, we show results in which the iRobot moves forward, moves backward, turns left, and turns right based on the received body message.
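The decoding pipeline described above can be sketched in a few lines: quantize the skeleton-derived arm angles to semaphore positions, look up the corresponding character, and accumulate characters into a command string. The angle-to-letter table below is an illustrative subset, not the paper's actual mapping or the full semaphore alphabet; angles are assumed to be measured in degrees from the arm-down position.

```python
# Illustrative sketch: decoding semaphore arm poses into characters,
# then combining the characters into a robot command string.
# The table below is a hypothetical subset of the semaphore alphabet.

SEMAPHORE = {
    (0, 45): 'A',    # left arm down, right arm low
    (0, 90): 'B',    # left arm down, right arm out
    (0, 135): 'C',   # left arm down, right arm high
}

def quantize(angle_deg):
    """Snap a measured arm angle to the nearest 45-degree semaphore position."""
    return int(round(angle_deg / 45.0)) * 45 % 360

def decode_pose(left_deg, right_deg):
    """Map a quantized (left, right) arm pose to a character, if defined."""
    return SEMAPHORE.get((quantize(left_deg), quantize(right_deg)))

def decode_message(poses):
    """Combine per-frame characters into an understandable message."""
    chars = (decode_pose(left, right) for left, right in poses)
    return ''.join(c for c in chars if c is not None)

# Example: three frames of noisy skeleton-derived arm angles
print(decode_message([(2.0, 44.0), (1.0, 92.0), (-3.0, 133.0)]))  # -> ABC
```

Quantizing to 45-degree bins makes the recognition tolerant of the noisy joint angles that come out of 3D-camera skeleton tracking, which is why semaphore's coarse arm positions suit this kind of interface.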
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Nguyen-Duc-Thanh, N., Stonier, D., Lee, S., Kim, DH. (2011). A New Approach for Human-Robot Interaction Using Human Body Language. In: Lee, G., Howard, D., Ślęzak, D. (eds) Convergence and Hybrid Information Technology. ICHIT 2011. Lecture Notes in Computer Science, vol 6935. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24082-9_92
DOI: https://doi.org/10.1007/978-3-642-24082-9_92
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-24081-2
Online ISBN: 978-3-642-24082-9
eBook Packages: Computer Science (R0)