Abstract
This chapter addresses nonverbal communication for human–robot interaction through the understanding of human upper-body gestures. A human–robot interaction system based on a novel combination of sensors is proposed. It allows a single user to interact with a humanoid social robot through natural body language. The robot understands the meaning of human upper-body gestures and expresses itself through a combination of body movements, facial expressions, and verbal language. A set of 12 upper-body gestures, including human–object interactions, is used for communication. These gestures are characterized by head, arm, and hand posture information: a CyberGlove II captures hand posture, which is combined with the head and arm posture information captured by a Microsoft Kinect, forming a new sensor solution for human gesture capture. Based on the resulting body posture data, an effective real-time human gesture recognition method is proposed. A human body gesture dataset was built for the experiments, and the results demonstrate the effectiveness and efficiency of the proposed approach.
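As a rough illustration of the sensor combination described in the abstract, the sketch below concatenates Kinect upper-body joint positions with CyberGlove joint angles into a single posture feature vector. The joint names, the head-relative normalization, and the 22-angle glove reading are assumptions for illustration; the chapter's actual feature design may differ.

```python
import numpy as np

def posture_feature(kinect_joints, glove_angles):
    """Concatenate upper-body joint positions (Kinect) with hand
    joint angles (CyberGlove) into one posture feature vector.

    kinect_joints : dict mapping joint name -> (x, y, z) position
    glove_angles  : sequence of hand joint angles (degrees)
    """
    # Hypothetical upper-body joint set; the chapter's exact set may differ.
    upper_body = ["head", "shoulder_l", "shoulder_r",
                  "elbow_l", "elbow_r", "hand_l", "hand_r"]
    # Express positions relative to the head so the feature does not
    # depend on where the user stands in front of the sensor.
    head = np.asarray(kinect_joints["head"], dtype=float)
    body_part = np.concatenate(
        [np.asarray(kinect_joints[j], dtype=float) - head for j in upper_body])
    hand_part = np.asarray(glove_angles, dtype=float)
    return np.concatenate([body_part, hand_part])

# Example: 7 joints x 3 coordinates + 22 glove angles -> 43-dim feature.
joints = {j: (0.1 * i, 0.2 * i, 0.3 * i) for i, j in enumerate(
    ["head", "shoulder_l", "shoulder_r",
     "elbow_l", "elbow_r", "hand_l", "hand_r"])}
feat = posture_feature(joints, [5.0] * 22)
print(feat.shape)  # (43,)
```

Concatenating the two modalities into one vector is only one possible fusion strategy; it has the advantage that any off-the-shelf classifier can then be applied to the combined feature.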
Notes
The source code is available at http://www.cse.wustl.edu/~kilian/code/lmnn/lmnn.html
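The note above points to the LMNN (large margin nearest neighbor) code of Weinberger and Saul. A minimal sketch of how a learned Mahalanobis-style metric plugs into nearest-neighbor gesture classification is shown below; the linear transform `L` here is hand-picked for illustration, standing in for the transform LMNN would learn from training data.

```python
import numpy as np

def nn_classify(query, train_feats, train_labels, L=None):
    """1-nearest-neighbor classification under a (learned) metric.

    If L is given, distances are computed in the transformed space
    L @ x, i.e. a Mahalanobis distance with M = L.T @ L -- the form
    of metric LMNN learns. With L = None this reduces to plain
    Euclidean 1-NN.
    """
    X = np.asarray(train_feats, dtype=float)
    q = np.asarray(query, dtype=float)
    if L is not None:
        X = X @ L.T
        q = L @ q
    dists = np.linalg.norm(X - q, axis=1)
    return train_labels[int(np.argmin(dists))]

# Toy 2-D example; gesture labels are made up. The transform L
# down-weights the second (noisy) feature dimension.
train = np.array([[0.0, 0.0], [1.0, 5.0], [2.0, 0.0]])
labels = ["wave", "point", "stop"]
L = np.diag([1.0, 0.1])
print(nn_classify([1.9, 4.0], train, labels, L))  # prints "stop"
```

Under plain Euclidean distance the query would land nearest to "point"; the learned metric suppresses the unreliable dimension and changes the decision, which is exactly the effect metric learning is meant to achieve.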
Copyright information

© 2016 Springer International Publishing Switzerland

Cite this chapter

Xiao, Y., Liang, H., Yuan, J., Thalmann, D. (2016). Body Movement Analysis and Recognition. In: Magnenat-Thalmann, N., Yuan, J., Thalmann, D., You, BJ. (eds) Context Aware Human-Robot and Human-Agent Interaction. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-19947-4_2

Print ISBN: 978-3-319-19946-7
Online ISBN: 978-3-319-19947-4