Faces of Emotion: Investigating Emotional Facial Expressions Towards a Robot

International Journal of Social Robotics

Abstract

Emotions have always been an intriguing topic, in everyday life as well as in science. As robots begin to move from industry halls into our private homes, emotions have become a vital theme for the field of human–robot interaction. Since Darwin, research has suggested that facial expressions are associated with emotions, and facial expressions could thus provide an ideal channel for natural, social human–robot interaction. Despite a growing body of research on the implementation of emotions in robots (mostly based on facial expressions), systematic research on users’ emotions and facial expressions towards robots remains largely neglected (cf. Arkin and Moshkina in Calvo R, D’Mello S, Gratch J, Kappas A (eds) The Oxford handbook of affective computing. Oxford University Press, New York, pp 483–493, 2015 on challenges in effective testing in affective human–robot interaction). We experimentally investigated the multilevel phenomenon of emotions using a multi-method approach. Since self-reports of emotions are prone to biases such as social desirability, we supplemented them with an objective behavioral measure. Using the Facial Action Coding System, we analyzed the facial expressions of 62 participants who watched the entertainment robot dinosaur Pleo either in a friendly interaction or being tortured. Participants differed in the type and frequency of the Action Units they displayed, as well as in their self-reported feelings, depending on the type of treatment they had watched (friendly or torture). In line with a previous study by Rosenthal-von der Pütten et al. (Int J Soc Robot 5(1):17–34, 2013. https://doi.org/10.1007/s12369-012-0173-8), participants reported feeling more positive after the friendly video and more negative after the torture video. In the torture condition, participants furthermore showed a wide range of Action Units primarily associated with negative emotions. For example, Action Unit 4 (“Brow Lowerer”), which is common in negative emotions such as anger and sadness, was displayed more frequently in the torture condition than in the friendly condition. Action Unit 12 (“Lip Corner Puller”), however, an Action Unit commonly associated with joy, was present in both conditions and thus not necessarily predictive of positive emotions. These findings underline the importance of a thorough investigation of the variables underlying emotional facial expressions. By investigating the Action Units participants display in an emotional situation, we aim to provide information on spontaneous facial expressions towards a robot that could also serve as guidance for automatic approaches.
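The core of the behavioral comparison described above, tallying which Action Units participants display in each condition and comparing their relative frequencies, can be sketched as follows. This is a minimal illustration only: the data and the `au_frequencies` helper are hypothetical and do not reproduce the study's actual codings or analysis.

```python
from collections import Counter

# Hypothetical FACS codings: each inner list holds the Action Units (AUs)
# coded for one participant while watching the video. Illustrative data,
# not the study's real observations.
friendly = [[12], [12, 6], [12], [1, 12], [12, 6]]
torture = [[4], [4, 1], [12, 4], [4, 15], [1, 4]]

def au_frequencies(annotations):
    """Return, for each AU, the proportion of participants who displayed it."""
    counts = Counter(au for aus in annotations for au in set(aus))
    n = len(annotations)
    return {au: c / n for au, c in counts.items()}

freq_friendly = au_frequencies(friendly)
freq_torture = au_frequencies(torture)

# In this toy data, AU 4 ("Brow Lowerer", linked to negative emotions)
# dominates the torture condition, while AU 12 ("Lip Corner Puller")
# appears in both conditions.
print(freq_friendly.get(12, 0.0))  # 1.0
print(freq_torture.get(4, 0.0))    # 1.0
print(freq_torture.get(12, 0.0))   # 0.2
```

In a real analysis the per-condition proportions would then feed into an inferential test of the condition difference, as the study reports for AU 4 versus AU 12.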


References

  1. Ambady N, Rosenthal R (1992) Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis. Psychol Bull 111(2):256–274. https://doi.org/10.1037/0033-2909.111.2.256

  2. Arkin RC, Moshkina L (2015) Affect in human–robot interaction. In: Calvo R, D’Mello S, Gratch J, Kappas A (eds) The Oxford handbook of affective computing. Oxford University Press, New York, pp 483–493

  3. Austin EJ, Deary IJ, Gibson GJ, McGregor MJ, Dent J (1998) Individual response spread in self-report scales: personality correlations and consequences. Pers Indiv Differ 24(3):421–438. https://doi.org/10.1016/s0191-8869(97)00175-x

  4. Bartneck C (2002) eMuu: an embodied emotional character for the ambient intelligent home. Ph.D. thesis, University Eindhoven. http://www.bartneck.de/publications/2002/eMuu/bartneckPHDThesis2002.pdf

  5. Bartneck C, Hu J (2008) Exploring the abuse of robots. Interact Stud 9(3):415–433. https://doi.org/10.1075/is.9.3.04bar

  6. Bartneck C, Rosalia C, Menges R, Deckers I (2005) Robot abuse—a limitation of the media equation. In: Proceedings of the interact 2005 workshop on agent abuse, Rome, Italy

  7. Becker-Asano C, Ishiguro H (2011) Evaluating facial displays of emotion for the android robot Geminoid F. In: 2011 IEEE workshop on affective computational intelligence (WACI), Paris, France. https://doi.org/10.1109/waci.2011.5953147

  8. Bethel CL, Murphy RR (2010) Review of human studies methods in HRI and recommendations. Int J Soc Robot 2(4):347–359. https://doi.org/10.1007/s12369-010-0064-9

  9. Martinez B, Valstar MF (2016) Advances, challenges, and opportunities in automatic facial expression recognition. In: Kawulok M, Celebi ME, Smolka B (eds) Advances in face detection and facial image analysis. Springer, Cham, pp 63–100

  10. Brave S, Nass C, Hutchinson K (2005) Computers that care: investigating the effects of orientation of emotion exhibited by an embodied computer agent. Int J Hum Comput Stud 62(2):161–178

  11. Breazeal C (2002) Regulation and entrainment in human–robot interaction. Int J Robot Res 21(10–11):883–902. https://doi.org/10.1177/0278364902021010096

  12. Breazeal C (2003) Emotion and sociable humanoid robots. Int J Hum Comput Stud 59(1–2):119–155. https://doi.org/10.1016/s1071-5819(03)00018-1

  13. Cañamero LD (2002) Playing the emotion game with Feelix. In: Socially intelligent agents. Springer, Boston, pp 69–76

  14. Carroll JM, Russell JA (1997) Facial expressions in Hollywood’s portrayal of emotion. J Pers Soc Psychol 72(1):164–176. https://doi.org/10.1037/0022-3514.72.1.164

  15. Costa S, Soares F, Santos C (2013) Facial expressions and gestures to convey emotions with a humanoid robot. In: Herrmann G, Pearson MJ, Lenz A, Bremner P, Spiers A, Leonards U (eds) Social robotics. Springer, Cham, pp 542–551

  16. Craig KD, Hyde SA, Patrick CJ (1991) Genuine, suppressed and faked facial behavior during exacerbation of chronic low back pain. Pain 46(2):161–171. https://doi.org/10.1016/0304-3959(91)90071-5

  17. Cramer H, Goddijn J, Wielinga B, Evers V (2010) Effects of (in)accurate empathy and situational valence on attitudes towards robots. In: 5th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, Istanbul, pp 141–142

  18. Davis MH (1983) Measuring individual differences in empathy: evidence for a multidimensional approach. J Pers Soc Psychol 44(1):113–126

  19. Dimberg U (1982) Facial reactions to facial expressions. Psychophysiology 19(6):643–647. https://doi.org/10.1111/j.1469-8986.1982.tb02516.x

  20. Dimberg U, Thunberg M (1998) Rapid facial reactions to emotional facial expressions. Scand J Psychol 39(1):39–45. https://doi.org/10.1111/1467-9450.00054

  21. Ekman P (1993) Facial expression and emotion. Am Psychol 48(4):384–392. https://doi.org/10.1037/0003-066x.48.4.384

  22. Ekman P, Friesen WV (1969) The repertoire of nonverbal behavior: categories, origins, usage, and coding. Semiotica. https://doi.org/10.1515/semi.1969.1.1.49

  23. Ekman P, Friesen WV (1982) Felt, false, and miserable smiles. J Nonverbal Behav 6(4):238–252. https://doi.org/10.1007/bf00987191

  24. Ekman P, Friesen WV (2003) Unmasking the face: a guide to recognizing emotions from facial clues. Malor Books, Los Altos

  25. Ekman P, Friesen WV, Ancoli S (1980) Facial signs of emotional experience. J Pers Soc Psychol 39(6):1125–1134. https://doi.org/10.1037/h0077722

  26. Ekman P, Friesen WV, Hager JC (2002) FACS investigators guide. Research Nexus eBook, Salt Lake City

  27. Ekman P, Friesen WV, Hager JC (2002) The facial action coding system. Research Nexus eBook, Salt Lake City

  28. Ekman P, Rosenberg EL (2005) What the face reveals. Oxford University Press, New York

  29. Endo N, Momoki S, Zecca M, Saito M, Mizoguchi Y, Itoh K, Takanishi A (2008) Development of whole-body emotion expression humanoid robot. In: 2008 IEEE international conference on robotics and automation. IEEE, Phoenix. https://doi.org/10.1109/robot.2008.4543523

  30. Fan X, Miller BC, Park KE, Winward BW, Christensen M, Grotevant HD, Tai RH (2006) An exploratory study about inaccuracy and invalidity in adolescent self-report surveys. Field Methods 18(3):223–244. https://doi.org/10.1177/152822x06289161

  31. Fasel B, Luettin J (2003) Automatic facial expression analysis: a survey. Pattern Recogn 36(1):259–275. https://doi.org/10.1016/s0031-3203(02)00052-3

  32. Friesen WV, Ekman P (1983) Emfacs-7: emotional facial action coding system. Unpublished manual

  33. Giudice MD, Colle L (2007) Differences between children and adults in the recognition of enjoyment smiles. Dev Psychol 43(3):796–803. https://doi.org/10.1037/0012-1649.43.3.796

  34. Gross JJ, Levenson RW (1995) Emotion elicitation using films. Cogn Emot 9(1):87–108. https://doi.org/10.1080/02699939508408966

  35. Haxby JV, Hoffman EA, Gobbini M (2000) The distributed human neural system for face perception. Trends Cogn Sci 4(6):223–233. https://doi.org/10.1016/s1364-6613(00)01482-0

  36. Heerink M, Kröse B, Evers V, Wielinga B (2009) Relating conversational expressiveness to social presence and acceptance of an assistive social robot. Virtual Real 14(1):77–84. https://doi.org/10.1007/s10055-009-0142-1

  37. Hegel F, Eyssel FA, Wrede B (2010) The social robot Flobi: key concepts of industrial design. In: Proceedings of the 19th IEEE international symposium in robot and human interactive communication (RO-MAN 2010), Viareggio, Italy, pp 120–125

  38. Kahn PH, Kanda T, Ishiguro H, Freier NG, Severson RL, Gill BT, Ruckert JH, Shen S (2012) Robovie, you will have to go into the closet now: childrens social and moral relationships with a humanoid robot. Dev Psychol 48(2):303–314. https://doi.org/10.1037/a0027033

  39. Kappas A, Krumhuber E, Küster D (2013) Facial behavior. In: Hall JA, Knapp ML (eds) Nonverbal communication. Handbooks of communication science. de Gruyter, Berlin, pp 131–166

  40. Kappas A, Pecchinenda A (1998) Zygomaticus major activity is not a selective indicator of positive affective state in ongoing interactive tasks. Psychophysiology 35:S44

  41. Larsen JT, Norris CJ, Cacioppo JT (2003) Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology 40(5):776–785. https://doi.org/10.1111/1469-8986.00078

  42. Lazarus RS (1991) Progress on a cognitive-motivational-relational theory of emotion. Am Psychol 46(8):819–834. https://doi.org/10.1037/0003-066x.46.8.819

  43. Lien JJJ, Kanade T, Cohn JF, Li CC (2000) Detection, tracking, and classification of action units in facial expression. Robot Auton Syst 31(3):131–146. https://doi.org/10.1016/s0921-8890(99)00103-7

  44. Menne IM, Schnellbacher C, Schwab F (2016) Facing emotional reactions towards a robot: an experimental study using FACS. In: Agah A, Cabibihan J, Howard A, Salichs MA, He H (eds) Social robotics. Springer, Cham, pp 372–381

  45. Mirnig N, Strasser E, Weiss A, Kühnlenz B, Wollherr D, Tscheligi M (2014) Can you read my face? Int J Soc Robot 7(1):63–76. https://doi.org/10.1007/s12369-014-0261-z

  46. Öhman A (2002) Automaticity and the amygdala: nonconscious responses to emotional faces. Curr Dir Psychol Sci 11(2):62–66

  47. Prkachin KM (1992) The consistency of facial expressions of pain: a comparison across modalities. Pain 51(3):297–306. https://doi.org/10.1016/0304-3959(92)90213-u

  48. Reed LI, Zeglen KN, Schmidt KL (2012) Facial expressions as honest signals of cooperative intent in a one-shot anonymous prisoners dilemma game. Evol Hum Behav 33(3):200–209. https://doi.org/10.1016/j.evolhumbehav.2011.09.003

  49. Reeves B, Nass C (1996) The media equation. How people treat computers, television, and new media like real people and places. CSLI Publications, Stanford

  50. Ribeiro T, Paiva A (2012) The illusion of robotic life. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction—HRI 12. ACM Press, Boston. https://doi.org/10.1145/2157689.2157814

  51. Riether N (2013) On the profoundness and preconditions of social responses towards social robots. Experimental investigations using indirect measurement techniques. Ph.D. thesis, Universität Bielefeld

  52. Rosenthal-von der Pütten AM, Krämer NC, Hoffmann L, Sobieraj S, Eimler SC (2013) An experimental study on emotional reactions towards a robot. Int J Soc Robot 5(1):17–34. https://doi.org/10.1007/s12369-012-0173-8

  53. Rosenthal-von der Pütten AM, Schulte FP, Eimler SC, Sobieraj S, Hoffmann L, Maderwald S, Brand M, Krämer NC (2014) Investigations on empathy towards humans and robots using fMRI. Comput Hum Behav 33:201–212. https://doi.org/10.1016/j.chb.2014.01.004

  54. Russell JA, Fernández-Dols JM (1997) The psychology of facial expression. Cambridge University Press, New York

  55. Sato W, Yoshikawa S (2007) Spontaneous facial mimicry in response to dynamic facial expressions. Cognition 104(1):1–18. https://doi.org/10.1016/j.cognition.2006.05.001

  56. Scherer KR, Schorr A, Johnstone T (2001) Appraisal processes in emotion: theory, methods, research. Oxford University Press, New York

  57. Takahashi Y, Hatakeyama M (2008) Fabrication of simple robot face regarding experimental results of human facial expressions. In: 2008 international conference on control, automation and systems. IEEE, Seoul, Korea. https://doi.org/10.1109/iccas.2008.4694495

  58. Unz D, Schwab F, Winterhoff-Spurk P (2008) TV news—the daily horror? Media Psychol 20(4):141–155. https://doi.org/10.1027/1864-1105.20.4.141

  59. Watson D, Clark LA, Tellegen A (1988) Development and validation of brief measures of positive and negative affect. The PANAS scales. J Pers Soc Psychol 54(6):1063–1070. https://doi.org/10.1037/0022-3514.54.6.1063

  60. Wilcox RR (2011) Introduction to robust estimation and hypothesis testing. Academic Press, Waltham

  61. Woods S, Walters M, Koay KL, Dautenhahn K (2006) Comparing human robot interaction scenarios using live and video based methods: towards a novel methodological approach. In: 9th IEEE international workshop on advanced motion control. IEEE, Istanbul. https://doi.org/10.1109/amc.2006.1631754

Acknowledgements

We gratefully acknowledge the assistance of Christin Schnellbacher in conducting the study.

Author information

Corresponding author

Correspondence to Isabelle M. Menne.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

About this article

Cite this article

Menne, I.M., Schwab, F. Faces of Emotion: Investigating Emotional Facial Expressions Towards a Robot. Int J of Soc Robotics 10, 199–209 (2018). https://doi.org/10.1007/s12369-017-0447-2
