Abstract
The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their bodies. Previous results show that adults can interpret different key poses displayed by a humanoid robot and that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases both arousal (the level of energy) and valence (how positive or negative the emotion is), whereas moving the head up increases them. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children's ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.
Acknowledgements
The authors would like to thank the school "Scuola Media Dante Alighieri" for hosting the study, as well as Arnaud Ducamp and Cornelius Glackin for their feedback on an earlier version of this paper.
Additional information
This work is funded by the EU FP7 ALIZ-E project (grant number 248116). This paper is an extended and improved version of: Beck, A., Cañamero, L., Damiano, L., Sommavilla, G., Tesser, F., Cosi, P.: Children Interpretation of Emotional Body Language Displayed by a Robot. In: Proceedings of Social Robotics, Third International Conference, ICSR 2011, pp 62–70. Springer, Amsterdam, The Netherlands (2011).
Cite this article
Beck, A., Cañamero, L., Hiolle, A. et al. Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children. Int J of Soc Robotics 5, 325–334 (2013). https://doi.org/10.1007/s12369-013-0193-z