
Do Emotional Robots Get More Help? How a Robot's Emotions Affect Collaborators' Willingness to Help

Published in: International Journal of Social Robotics

Abstract

This study explored the potential of artificial emotions displayed by a robot to enhance communication and increase human willingness to assist when the robot faces a task it cannot accomplish. Using a process-oriented approach, emotions were viewed as an integral part of the complex dynamics between individuals and their environment, facilitating social cues for coordinated actions. In the first study, participants were shown videos of a robot displaying no emotion, sadness, or anger following a failed task. Participants accurately identified the artificial emotions, and the results indicated that displaying emotions improved overall understanding of the robot's situation. However, it had no significant effect on participants' willingness to help. The second study focused on the robot's role as a collaborator. Participants watched the same videos as in the first study. The results revealed that, on the whole, participants preferred a neutral robot as their collaborator and showed a particularly strong aversion to the angry robot. While the sad robot increased participants' willingness to help, the study suggests that careful selection of artificial emotions is crucial, taking into account situational appropriateness and the emotional impact on human collaborators. This acknowledges the existence of an affective loop between the robot's artificial emotion and its human counterpart. Overall, this research highlights the potential importance of artificial emotions in human–robot interactions, emphasizes the need for careful consideration when incorporating such emotions, and recognizes the complex interplay between a robot's emotional expression and its impact on human collaborators.


Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.

Notes

  1. https://www.digitaldreamlabs.com/pages/cozmo


Acknowledgements

Many thanks to Sujitra Sutthithatip, who helped with the preparation and conduct of this research. No grants were received for this research.

Funding

No funding was received to assist with the preparation of this manuscript. The authors have no relevant financial or non-financial interests to disclose.

Author information


Contributions

All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by all authors. The first draft of the manuscript was written by Jacqueline Urakami and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jacqueline Urakami.

Ethics declarations

Conflict of interest

The author confirms that there are no potential conflicts of interest.

Ethical approval

The study was approved by the Ethical review board at Tokyo University of Technology, approval number 2018135.

Human and animal rights statement

This research involves human participants.

Informed consent

Informed consent was obtained from all participants.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Urakami, J. Do Emotional Robots Get More Help? How a Robot's Emotions Affect Collaborators' Willingness to Help. Int J of Soc Robotics 15, 1457–1471 (2023). https://doi.org/10.1007/s12369-023-01058-1

