
Perception and Generation of Affective Hand Movements

Published in: International Journal of Social Robotics

Abstract

Perception and generation of affective movements are essential for achieving the expressivity required for a fully engaging human-machine interaction. This paper develops a computational model for recognizing and generating affective hand movements for display on anthropomorphic and non-anthropomorphic structures. First, time-series features of these movements are aligned and converted to fixed-length vectors using piecewise linear re-sampling. Next, a feature transformation best capable of discriminating between the affective movements is obtained using functional principal component analysis (FPCA). The resulting low-dimensional feature transformation is used for classification and regeneration. A dataset consisting of one movement type, closing and opening the hand, is considered for this study. Three different expressions, sadness, happiness, and anger, were conveyed by a demonstrator through the same general movement. The performance of the developed model is evaluated objectively using leave-one-out cross-validation and subjectively through a user study, in which participants evaluated the regenerated affective movements as well as the original affective movements, reproduced both on a human-like model and on a non-anthropomorphic structure. The proposed approach achieves zero leave-one-out cross-validation errors on both the training and testing sets. No significant difference is observed between participants' evaluations of the regenerated movements and the original movements, which confirms successful regeneration of the affective movements. Furthermore, a significant effect of structure on the perception of affective movements is observed.
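The piecewise linear re-sampling step described above maps recordings of different durations onto fixed-length vectors so that they can be compared feature-by-feature. A minimal sketch of that idea (not the paper's implementation; the function name and the 1-D toy signals are illustrative assumptions):

```python
import numpy as np

def resample_fixed_length(series, n_points):
    """Piecewise-linearly interpolate a 1-D time series onto n_points
    evenly spaced sample times, yielding a fixed-length vector."""
    series = np.asarray(series, dtype=float)
    old_t = np.linspace(0.0, 1.0, len(series))   # original sample times, normalized
    new_t = np.linspace(0.0, 1.0, n_points)      # common time base for all movements
    return np.interp(new_t, old_t, series)       # linear interpolation between samples

# Movements of different durations map onto vectors of equal length,
# so they become directly comparable.
short = resample_fixed_length([0.0, 1.0, 0.5], 5)
long_ = resample_fixed_length(np.sin(np.linspace(0, np.pi, 100)), 5)
```

Because the interpolation is linear between the original samples, the endpoints of each movement are preserved exactly, which matters when the start and end postures carry expressive information.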
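The FPCA stage projects the fixed-length movement vectors onto a low-dimensional basis in which the three expressions separate, and leave-one-out cross-validation then checks that a simple classifier recovers the labels. The following sketch substitutes ordinary PCA (via SVD) for the paper's functional PCA and uses synthetic curves in place of the hand-movement recordings; the prototypes, noise level, and 1-nearest-neighbour classifier are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for resampled movement vectors: three "expressions",
# each a noisy variant of a distinct prototype curve.
t = np.linspace(0, 1, 20)
prototypes = [np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t ** 2]
X = np.vstack([p + 0.05 * rng.standard_normal((10, t.size))
               for p in prototypes])
y = np.repeat([0, 1, 2], 10)

# PCA via SVD of the mean-centred data matrix (an ordinary-PCA
# stand-in for functional PCA on the resampled curves).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T          # project onto the top 3 components

# Leave-one-out cross-validation with a 1-nearest-neighbour classifier:
# each sample is held out in turn and classified by its closest
# neighbour in the low-dimensional space.
errors = 0
for i in range(len(Z)):
    d = np.linalg.norm(Z - Z[i], axis=1)
    d[i] = np.inf          # exclude the held-out sample itself
    errors += int(y[np.argmin(d)] != y[i])
```

With well-separated prototypes, the held-out samples are classified without error, mirroring the zero leave-one-out error reported in the abstract; on real movement data the separation would of course depend on the recordings.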



Acknowledgements

This work was funded in part by the Natural Sciences and Engineering Research Council of Canada and the Canada Council for the Arts, through the collaborative New Media Initiative. The authors also wish to thank the anonymous reviewers and Stefanie Blain-Moraes for their careful reading of the manuscript. Their detailed suggestions significantly improved the quality and readability of the paper.

Author information

Corresponding author

Correspondence to Ali-Akbar Samadani.


Cite this article

Samadani, AA., Kubica, E., Gorbet, R. et al. Perception and Generation of Affective Hand Movements. Int J of Soc Robotics 5, 35–51 (2013). https://doi.org/10.1007/s12369-012-0169-4
