Cross-Cultural Perspectives on Emotion Expressive Humanoid Robotic Head: Recognition of Facial Expressions and Symbols


Abstract

Emotion display through facial expressions is an important channel of communication. However, the meaning humans assign to facial cues differs according to cultural background, which creates a gap in expression recognition rates. The same problem arises when a robotic face is displayed: recognition of a robot’s facial expressions is often hampered by this cultural divide, and poor recognition rates may lead to poor acceptance and interaction. It would therefore be desirable for robots to switch their facial output flexibly, adapting to different cultural backgrounds. To achieve this, we built a generation system that produces facial expressions and applied it to the 24-degrees-of-freedom head of the humanoid social robot KOBIAN-R. Drawing on the work of illustrators and cartoonists, the system can generate two versions of the same expression, each intended to be easily recognisable by Japanese or Western subjects. As a further aid to recognition, the display of Japanese comic symbols on the robotic face was also introduced and evaluated. We conducted a cross-cultural study aimed at assessing this recognition gap and finding solutions for it, extending the investigation to Egyptian subjects as a sample of a further culture. The results confirmed the differences in recognition rates, the effectiveness of customising expressions, and the usefulness of symbol display, suggesting that this approach may be valuable for robots that will interact in multi-cultural environments in the future.
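The abstract describes the core idea at an architectural level: the same emotion is rendered as a culture-specific actuator configuration, optionally accompanied by a manga-style symbol. The following Python sketch illustrates that idea only; it is not the authors' implementation, and all names (FaceCommand, CULTURE_PRESETS, SYMBOL_OVERLAYS, make_expression) and the numeric values are hypothetical.

```python
# Minimal sketch (assumption, not the paper's code): choose a culture-specific
# facial configuration for a multi-DOF head and optionally add a comic-symbol overlay.
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class FaceCommand:
    """Target positions for the head's facial actuators plus an optional symbol."""
    dof_targets: Dict[str, float]      # e.g. {"brow_l": 0.8, "mouth_open": 0.2}
    symbol: Optional[str] = None       # e.g. "anger_mark", "tear_drop"


# Hypothetical presets: the same emotion maps to different configurations
# depending on the observer's cultural background.
CULTURE_PRESETS: Dict[str, Dict[str, Dict[str, float]]] = {
    "japanese": {"anger": {"brow_l": 0.9, "brow_r": 0.9, "mouth_open": 0.1}},
    "western":  {"anger": {"brow_l": 0.7, "brow_r": 0.7, "mouth_open": 0.6}},
}

# Hypothetical manga-style symbols that can be shown alongside the expression.
SYMBOL_OVERLAYS: Dict[str, str] = {"anger": "anger_mark", "sadness": "tear_drop"}


def make_expression(emotion: str, culture: str, use_symbols: bool = False) -> FaceCommand:
    """Build a face command for `emotion`, adapted to the target `culture`."""
    targets = CULTURE_PRESETS[culture][emotion]
    symbol = SYMBOL_OVERLAYS.get(emotion) if use_symbols else None
    return FaceCommand(dof_targets=dict(targets), symbol=symbol)


if __name__ == "__main__":
    cmd = make_expression("anger", culture="japanese", use_symbols=True)
    print(cmd.dof_targets, cmd.symbol)
```

The design point mirrored here is that cultural adaptation is a lookup-and-render step layered on top of a single expression generator, so the same head can serve observers from different backgrounds without mechanical changes.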



Acknowledgements

This study was conducted as part of the Research Institute for Science and Engineering, Waseda University, and as part of the humanoid project at the Humanoid Robotics Institute, Waseda University. It was supported in part by the RoboSoM project of the European FP7 programme (Grant Agreement No. 248366), the GCOE Program “Global Robot Academia” of the Ministry of Education, Culture, Sports, Science and Technology of Japan, SolidWorks Japan K.K., NiKKi Fron Co., Chukoh Chemical Industries, STMicroelectronics, and DYDEN Corporation, whom we thank for their financial and technical support.

Author information

Corresponding author

Correspondence to Gabriele Trovato.


About this article

Cite this article

Trovato, G., Kishi, T., Endo, N. et al. Cross-Cultural Perspectives on Emotion Expressive Humanoid Robotic Head: Recognition of Facial Expressions and Symbols. Int J of Soc Robotics 5, 515–527 (2013). https://doi.org/10.1007/s12369-013-0213-z
