
Can You Read My Face?

A Methodological Variation for Assessing Facial Expressions of Robotic Heads

International Journal of Social Robotics

Abstract

This paper reports on an online study of robot facial expressions. We performed the study both to assess the quality of the current facial expressions of two robot heads and to develop a simple, easy-to-use methodological variation for evaluating the facial expressions of robotic heads. Short movie clips of two different robot heads showing happy, sad, surprised, and neutral facial expressions were compiled into an online survey to examine how people interpret these expressions. Additionally, we added a control condition with a human face showing the same four emotions. The results showed that the facial expressions of both heads were recognized well. Even the blended emotion surprise was recognized, although it evoked both positive and negative connotations. These results underline the importance of situational context for correctly interpreting emotional facial expressions. Besides the expected finding that the human was perceived as significantly more anthropomorphic and animate than both robot heads, the robot head with a more human-like design was rated significantly higher on anthropomorphism than the robot head with animal-like features. With regard to the validation procedure, we provide evidence for a feasible two-step approach: by first assessing participants’ dispositional empathy with a questionnaire, one can ensure that they are generally able to decode facial expressions into the corresponding emotions; subsequently, the robot’s facial expressions can be validated with a closed-question approach.
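To make the two-step procedure concrete, here is a minimal sketch in Python. It is not the authors' implementation: the empathy cutoff, the data layout, and all names are illustrative assumptions. The sketch mirrors the described procedure by first screening participants on a dispositional-empathy score and then computing recognition rates per head and intended emotion from the closed (forced-choice) answers.

```python
from collections import defaultdict

EMPATHY_CUTOFF = 45  # hypothetical screening threshold, not from the paper

# Hypothetical response data: each participant has a dispositional-empathy
# score and forced-choice answers keyed by (robot head, intended emotion).
participants = [
    {"empathy": 52, "answers": {("head_a", "happy"): "happy",
                                ("head_a", "sad"): "sad"}},
    {"empathy": 38, "answers": {("head_a", "happy"): "surprised",
                                ("head_a", "sad"): "sad"}},
]

# Step 1: screen out participants whose empathy score suggests they cannot
# reliably decode facial expressions in general.
screened = [p for p in participants if p["empathy"] >= EMPATHY_CUTOFF]

# Step 2: validate the robot expressions via recognition rates from the
# closed (forced-choice) questions.
hits, totals = defaultdict(int), defaultdict(int)
for p in screened:
    for (head, intended), chosen in p["answers"].items():
        totals[(head, intended)] += 1
        hits[(head, intended)] += int(chosen == intended)

for key in sorted(totals):
    print(key, f"{hits[key] / totals[key]:.0%}")
```

In practice, the screening threshold would be derived from the empathy questionnaire's published norms rather than set to an arbitrary constant.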





Acknowledgments

This work was supported by the European Commission within the IURO project (http://www.iuro-project.eu).

Author information

Corresponding author

Correspondence to Nicole Mirnig.


About this article


Cite this article

Mirnig, N., Strasser, E., Weiss, A. et al. Can You Read My Face? Int J Soc Robot 7, 63–76 (2015). https://doi.org/10.1007/s12369-014-0261-z

