
Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children

Published in: International Journal of Social Robotics

Abstract

The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their body. Previous results show that adults can interpret different key poses displayed by a humanoid robot, and that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases arousal (the level of energy) and valence (how positive or negative the emotion is), whereas moving the head up increases both dimensions. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children's ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.
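As a hedged illustration of the head-position manipulation described above, the sketch below tilts the head of an Aldebaran Nao (the platform linked in the Notes) up or down while holding a pose, assuming the NAOqi Python SDK is available. The robot address, the angle values, and the display_key_pose helper are placeholders for illustration, not the authors' actual stimuli or code.

```python
# Minimal sketch (not the authors' code) of modulating a key pose's expressed
# arousal/valence on a Nao robot by changing head pitch, per the finding that
# head up raises and head down lowers both dimensions.
# Assumes the NAOqi Python SDK; ROBOT_IP and angle values are placeholders.
import time
from naoqi import ALProxy

ROBOT_IP, PORT = "192.168.1.10", 9559   # placeholder address

motion = ALProxy("ALMotion", ROBOT_IP, PORT)
motion.wakeUp()

# On Nao, negative HeadPitch tilts the head up, positive tilts it down.
HEAD_UP, HEAD_DOWN = -0.3, 0.3          # radians, illustrative values only

def display_key_pose(head_pitch, hold=2.0):
    """Hold the current (placeholder) key pose with the requested head pitch."""
    motion.setAngles("HeadPitch", head_pitch, 0.2)  # 0.2 = fraction of max speed
    time.sleep(hold)

display_key_pose(HEAD_UP)    # variant expected to read as more positive/energetic
display_key_pose(HEAD_DOWN)  # variant expected to read as more negative/subdued
```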


Notes

  1. http://www.aldebaran-robotics.com/en.


Acknowledgements

The authors would like to thank the school “Scuola Media Dante Alighieri” for hosting the study, as well as Arnaud Ducamp and Cornelius Glackin for their feedback on an earlier version of this paper.

Author information

Correspondence to Aryel Beck.

Additional information

This work is funded by the EU FP7 ALIZ-E project (grant number 248116). This paper is an extended and improved version of Beck, A., Cañamero, L., Damiano, L., Sommavilla, G., Tesser, F., Cosi, P.: Children Interpretation of Emotional Body Language Displayed by a Robot. In: Proceedings of Social Robotics, Third International Conference, ICSR 2011, pp 62–70, Springer, Amsterdam, The Netherlands (2011).


Cite this article

Beck, A., Cañamero, L., Hiolle, A. et al. Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children. Int J of Soc Robotics 5, 325–334 (2013). https://doi.org/10.1007/s12369-013-0193-z
