Abstract
Perception and generation of affective movements are essential for achieving the expressivity required for fully engaging human-machine interaction. This paper develops a computational model for recognizing and generating affective hand movements for display on anthropomorphic and non-anthropomorphic structures. First, the time-series features of these movements are aligned and converted to fixed-length vectors using piece-wise linear re-sampling. Next, a feature transformation best able to discriminate between the affective movements is obtained using functional principal component analysis (FPCA). The resulting low-dimensional feature transformation is used for both classification and regeneration. A dataset consisting of a single movement type, closing and opening the hand, is considered for this study. Three different expressions, sadness, happiness, and anger, were conveyed by a demonstrator through the same general movement. The performance of the developed model is evaluated objectively using leave-one-out cross-validation and subjectively through a user study in which participants evaluated both the regenerated and the original affective movements, reproduced on a human-like model and on a non-anthropomorphic structure. The proposed approach achieves zero leave-one-out cross-validation errors on both the training and testing sets. No significant difference is observed between participants’ evaluations of the regenerated movements and the original movements, confirming successful regeneration of the affective movements. Furthermore, a significant effect of structure on the perception of affective movements is observed.
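The two preprocessing steps described above can be illustrated with a minimal sketch in Python/NumPy. This is not the authors' implementation: ordinary PCA on the re-sampled vectors is used here as a simplified stand-in for FPCA (which operates on smooth basis-expanded curves), and the two movement "classes", their lengths, and their shapes are invented purely for illustration.

```python
import numpy as np

def resample_fixed(series, n_points=50):
    """Piece-wise linear re-sampling of a variable-length 1-D
    time series onto a fixed-length grid over [0, 1]."""
    series = np.asarray(series, dtype=float)
    src = np.linspace(0.0, 1.0, len(series))
    dst = np.linspace(0.0, 1.0, n_points)
    return np.interp(dst, src, series)

def pca_transform(X, n_components=2):
    """Plain PCA via SVD; a simplified stand-in for FPCA."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, mean, Vt[:n_components]

# Toy demo: two synthetic movement "classes" recorded at
# different lengths, differing only in amplitude.
rng = np.random.default_rng(0)
big = [np.sin(np.linspace(0, np.pi, n)) + 0.01 * rng.standard_normal(n)
       for n in (40, 55, 63)]
small = [0.5 * np.sin(np.linspace(0, np.pi, n)) + 0.01 * rng.standard_normal(n)
         for n in (47, 52, 70)]

# Align all recordings to fixed-length vectors, then project
# onto the first principal component.
X = np.vstack([resample_fixed(s) for s in big + small])
scores, _, _ = pca_transform(X, n_components=1)
print(scores.shape)  # prints (6, 1)
```

In this toy setting the first principal score already separates the two amplitude classes, mirroring how the paper's low-dimensional FPCA scores support both classification and (via the inverse transform) regeneration of movements.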
Acknowledgements
The work was funded in part by the Natural Sciences and Engineering Research Council of Canada and the Canada Council for the Arts, through the collaborative New Media Initiative. The authors also wish to thank the anonymous reviewers and Stefanie Blain-Moraes for their careful reading of the manuscript. Their detailed suggestions significantly improved the quality and readability of the paper.
Samadani, AA., Kubica, E., Gorbet, R. et al. Perception and Generation of Affective Hand Movements. Int J of Soc Robotics 5, 35–51 (2013). https://doi.org/10.1007/s12369-012-0169-4