
Automatic Affect Perception Based on Body Gait and Posture: A Survey

  • Survey
  • Published in the International Journal of Social Robotics

Abstract

There has been growing interest in machine-based recognition of emotions from body gait, both on its own and in combination with other modalities. To highlight the major trends and the state of the art in this area, the literature on machine-based human emotion perception through gait and posture is explored. First, the effectiveness of human intellect and intuition in perceiving emotions across a range of cultures is examined. Major studies in machine-based affect recognition are then reviewed and their performance is compared. The survey concludes by critically analysing some of the issues raised in affect recognition using gait and posture, and by identifying gaps in the current understanding of this area.
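To make the surveyed pipeline concrete, the sketch below shows the typical shape of a gait-based affect classifier: per-sample gait and posture features are standardised and fed to a supervised classifier, with performance reported as cross-validated recognition accuracy. This is a minimal illustration using scikit-learn on synthetic placeholder data; the feature set, labels, and classifier choice are assumptions for demonstration, not taken from any particular study reviewed in the survey.

    # Minimal sketch of a gait-based affect recognition pipeline
    # (illustrative placeholder data, not from any study in this survey).
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Hypothetical per-walk features, e.g. walking speed, stride length,
    # cadence, head inclination, shoulder slouch.
    X = rng.normal(size=(120, 5))
    # Hypothetical discrete affect labels: 0=neutral, 1=happy, 2=sad, 3=angry.
    y = rng.integers(0, 4, size=120)

    # Standardise features, then classify with an RBF-kernel SVM -- one of
    # several classifiers commonly benchmarked in this literature.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

    # Recognition performance is typically reported as cross-validated accuracy.
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"mean cross-validated accuracy: {scores.mean():.2f}")

The studies compared in this survey differ mainly in how the features are extracted (motion capture, depth cameras, force plates) and which classifier is trained; the evaluation pattern of cross-validated accuracy on labelled affect categories is broadly shared.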


Acknowledgements

This research has been conducted with the support of the Australian Government Research Training Program Scholarship.

Author information

Corresponding author

Correspondence to Benjamin Stephens-Fripp.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.


About this article

Cite this article

Stephens-Fripp, B., Naghdy, F., Stirling, D. et al. Automatic Affect Perception Based on Body Gait and Posture: A Survey. Int J of Soc Robotics 9, 617–641 (2017). https://doi.org/10.1007/s12369-017-0427-6
