
Emotional Influence of Pupillary Changes of Robots with Different Human-Likeness Levels on Human

Published in: International Journal of Social Robotics

Abstract

This study explored the emotional influence of the pupillary change (PC) of robots with different human-likeness levels on people. Images of the eye areas of five agents (one human and four typical existing humanoid robots of varying human-likeness) were edited into five 27-second videos. In the experimental group, 31 participants watched the five videos with PC applied to the agents' eyes; in the control group, another 31 participants watched the same videos without PC. Afterward, the participants rated their feelings about the videos. The results showed that PC on its own did not change people's emotions towards the agents. However, PC applied to the eyes of a robot perceived as non-threatening, and therefore likely to evoke empathy, subconsciously enhanced people's positive emotions, whereas PC applied to the human images increased people's negative emotions and reduced the feeling of familiarity.
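The PC manipulation described above, animating dilation and constriction of an agent's pupils over a 27-second clip, can be sketched as a simple diameter time series that could drive such an animation. This is a minimal illustration only: the function name, frame rate, baseline diameter, amplitude, and period below are assumptions for the sketch, not values reported in the study.

```python
import math

def pupil_diameter_trajectory(duration_s=27.0, fps=30, base_mm=4.0,
                              amplitude_mm=1.0, period_s=9.0):
    """Return a smooth pupil-diameter time series (mm), one value per frame.

    The diameter oscillates sinusoidally around `base_mm`, staying within
    base_mm +/- amplitude_mm. All parameter values are illustrative.
    """
    n_frames = int(duration_s * fps)
    frames_per_period = period_s * fps
    return [base_mm + amplitude_mm * math.sin(2 * math.pi * t / frames_per_period)
            for t in range(n_frames)]

traj = pupil_diameter_trajectory()
print(len(traj))             # 810 frames for a 27-s clip at 30 fps
print(min(traj), max(traj))  # diameters stay within base_mm +/- amplitude_mm
```

Each frame's diameter could then be mapped to the pupil radius in the edited eye-area image; a non-periodic or stimulus-driven trajectory would work the same way, only the generating function changes.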


[Figures 1-6 appear in the full article.]


Availability of data and material

Upon request.


Acknowledgements

The current research was approved by Tianjin University's local Ethics Committee. The authors thank all participants for their support, and Editage for proofreading assistance.

Funding

This research was supported by the Natural Science Foundation of Tianjin City (Grant No. 19JCYBJC19500) and the National Natural Science Foundation of China (Grant No. 51875399).

Author information


Corresponding author

Correspondence to Yanqun Huang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (MP4 680 kb)

Supplementary file2 (MP4 800 kb)


Cite this article

Xue, J., Huang, Y., Li, X. et al. Emotional Influence of Pupillary Changes of Robots with Different Human-Likeness Levels on Human. Int J of Soc Robotics 14, 1687–1696 (2022). https://doi.org/10.1007/s12369-022-00903-z
