The Effects of Humanlike and Robot-Specific Affective Nonverbal Behavior on Perception, Emotion, and Behavior

International Journal of Social Robotics

Abstract

Research has demonstrated that humans are able to interpret humanlike (affective) nonverbal behavior (HNB) in artificial entities (e.g. Beck et al., in: Proceedings of the 19th IEEE international symposium on robot and human interactive communication, IEEE Press, Piscataway, 2010. https://doi.org/10.1109/ROMAN.2010.5598649; Bente et al. in J Nonverbal Behav 25: 151–166, 2001; Mumm and Mutlu, in: Proceedings of the 6th international conference on human–robot interaction, HRI. ACM Press, New York, 2011. https://doi.org/10.1145/1957656.1957786). However, some robots lack the ability to produce HNB. Using robot-specific nonverbal behavior (RNB), such as different eye colors, to convey emotional meaning might be a fruitful mechanism to enhance HRI experiences, but it is unclear whether RNB is as effective as HNB. We present a review of affective nonverbal behaviors in robots and an experimental study. We experimentally tested the influence of HNB and RNB (colored LEDs) on users' perception of the robot (e.g. likeability, animacy), their emotional experience, and their self-disclosure. In a between-subjects design, users (n = 80) interacted with either (a) a robot displaying no nonverbal behavior, (b) a robot displaying affective RNB, (c) a robot displaying affective HNB, or (d) a robot displaying both affective HNB and RNB. Results show that HNB, but not RNB, has a significant effect on the perceived animacy of the robot, participants' emotional state, and self-disclosure. However, RNB still slightly influenced participants' perception, emotion, and behavior: planned contrasts revealed that having any type of nonverbal behavior significantly increased perceived animacy, positive affect, and self-disclosure. Moreover, observed linear trends indicate that the effects increased with the addition of nonverbal behaviors (control < RNB < HNB). In combination, our results suggest that HNB is more effective than RNB in conveying the robot's communicative message.
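
To make the analysis logic concrete, the following sketch (in Python, with simulated data) illustrates how planned contrasts of the kind reported above can be computed over four group means: one contrast comparing any nonverbal behavior against the control condition, and one linear trend across the ordered conditions. The ratings, contrast weights, and variable names are illustrative assumptions and do not reproduce the authors' analysis or results.

    import numpy as np
    from scipy import stats

    # Illustrative (simulated) ratings for the four conditions, 20 participants each.
    rng = np.random.default_rng(0)
    groups = {
        "control": rng.normal(2.5, 0.8, 20),
        "RNB":     rng.normal(2.8, 0.8, 20),
        "HNB":     rng.normal(3.3, 0.8, 20),
        "HNB+RNB": rng.normal(3.4, 0.8, 20),
    }
    means = np.array([g.mean() for g in groups.values()])
    ns = np.array([len(g) for g in groups.values()])
    # Pooled within-group variance (valid here because group sizes are equal).
    ms_error = np.mean([g.var(ddof=1) for g in groups.values()])

    def planned_contrast(weights):
        """t-test for one planned contrast over the four group means."""
        w = np.asarray(weights, dtype=float)
        estimate = w @ means
        se = np.sqrt(ms_error * np.sum(w ** 2 / ns))
        df = ns.sum() - len(ns)
        t = estimate / se
        p = 2 * stats.t.sf(abs(t), df)
        return t, p

    # Contrast 1: any nonverbal behavior (RNB, HNB, HNB+RNB) vs. no nonverbal behavior.
    print("any NVB vs. control:", planned_contrast([-3, 1, 1, 1]))
    # Contrast 2: linear trend across the ordered conditions (control < RNB < HNB < HNB+RNB).
    print("linear trend:", planned_contrast([-3, -1, 1, 3]))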


References

  1. Banse R, Scherer KR (1996) Acoustic profiles in vocal emotion expression. J Pers Soc Psychol 70:614–636

  2. Bartneck C, Kulić D, Croft E, Zoghbi S (2009) Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int J Soc Robot 1:71–81

  3. Beck A, Canamero L, Bard KA (2010) Towards an affect space for robots to display emotional body language. In: Proceedings of the 19th IEEE international symposium on robot and human interactive communication, RO-MAN. IEEE Press, Piscataway, New Jersey, pp 464–469. https://doi.org/10.1109/ROMAN.2010.5598649

  4. Beck A, Cañamero L, Hiolle A, Damiano L, Cosi P, Tesser F, Sommavilla G (2013) Interpretation of emotional body language displayed by a humanoid robot: a case study with children. Int J Soc Robot 5:325–334

  5. Becker-Asano C, Ishiguro H (2011) Evaluating facial displays of emotion for the android robot Geminoid F. In: 2011 IEEE workshop on affective computational intelligence. Piscataway, New Jersey, pp 1–8. https://doi.org/10.1109/WACI.2011.5953147

  6. Bente G, Krämer NC (2003) Integrierte Registrierung und Analyse verbaler und nonverbaler Kommunikation. In: Herrmann T, Grabowski, J (ed) Sprachproduktion. Enzyklopädie der Psychologie. Themenbereich C Serie 3, Band 1. Hogrefe, Göttingen, pp 219–246

  7. Bente G, Krämer NC, Petersen A, de Ruiter JP (2001) Computer animated movement and person perception: methodological advances in nonverbal behavior research. J Nonverbal Behav 25:151–166

  8. Breazeal C, Kidd C, Thomaz A, Hoffman G, Berlin M (2005) Effects of nonverbal communication on efficiency and robustness in human–robot teamwork. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, IROS, pp 708–713. https://doi.org/10.1109/IROS.2005.1545011

  9. Burgoon JK, Bacue AE (2003) Nonverbal communication skills. In: Greene JO, Burleson BR (eds) Handbook of communication and social interaction skills. LEA’s communication series. Lawrence Erlbaum Associates, Mahwah, pp 179–220

  10. Burgoon JK, Guerrero LK, Manusov V (2011) Nonverbal signals. In: Knapp ML, Daly JA (eds) The sage handbook of interpersonal communication. Sage, Thousand Oaks, pp 239–282

  11. Cicchetti DV (1994) Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychol Assess 6:284–290

  12. Collins EC, Prescott TJ, Mitchinson B (2015) Saying it with light: a pilot study of affective communication using the MIRO robot. In: Wilson SP, Verschure PF, Mura A, Prescott TJ (eds) Biomimetic and biohybrid systems. Lecture notes in computer science. Springer, Cham, pp 243–255. https://doi.org/10.1007/978-3-319-22979-9_25

  13. Dael N, Mortillaro M, Scherer KR (2012) Emotion expression in body action and posture. Emotion 12:1085–1101

  14. Ekman P (1993) Facial expression and emotion. Am Psychol 48:384–392

  15. Embgen S, Luber M, Becker-Asano C, Ragni M, Evers V, Arras KO (2012) Robot-specific social cues in emotional body language. In: The 21st IEEE international symposium on robot and human interactive communication, RO-MAN, pp 1019–1025. https://doi.org/10.1109/ROMAN.2012.6343883

  16. Häring M, Bee N, André E (2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In: The 20th IEEE international symposium on robot and human interactive communication, RO-MAN, pp 204–209. https://doi.org/10.1109/ROMAN.2011.6005263

  17. Hurlbert AC, Ling Y (2007) Biological components of sex differences in color preference. Curr Biol 17:R623

  18. Johnson DO, Cuijpers RH, van der Pol D (2013) Imitating human emotions with artificial facial expressions. Int J Soc Robot 5:503–513

  19. Kang SH, Gratch J (2010) Virtual humans elicit socially anxious interactants’ verbal self-disclosure. Comput Animat Virt W 21:473–482

  20. Kishi T, Endo N, Nozawa T et al. (2010) Bipedal humanoid robot that makes humans laugh with use of the method of comedy and affects their psychological state actively. In: Proceedings of the IEEE international conference on robotics and automation (ICRA’10), pp 1965–1970

  21. Krämer NC, Kopp S, Becker-Asano C, Sommer N (2013) Smile and the world will smile with you—the effects of a virtual agent’s smile on users’ evaluation and behavior. Int J Hum Comput Stud 71:335–349

  22. Leite I, Martinho C, Pereira A, Paiva A (2008) iCat: an affective game buddy based on anticipatory mechanisms. In: Proceedings of the 7th international conference on autonomous agents and multiagent systems, AAMAS. Estoril, Portugal, pp 1229–1232

  23. Leite I, Mascarenhas S, Pereira A, Martinho C, Prada R, Paiva A (2010) “Why can’t we be friends?” An empathic game companion for long-term interaction. In: Hutchison D et al (eds) Intelligent virtual agents. Lecture notes in computer science. Springer, Berlin, pp 215–321. https://doi.org/10.1007/978-3-642-15892-6_32

  24. Li J, Chignell M (2011) Communication of emotion in social robots through simple head and arm movements. Int J Soc Robot 3:125–142

  25. Manav B (2007) Color-emotion associations and color preferences: a case study for residences. Color Res App 32:144–150

  26. Manstead ASR, Fischer AH, Jakobs EB (1999) The social and emotional functions of facial displays. In: Phillipot P, Feldman RS, Coats EJ (eds) The social context of nonverbal behavior. Cambridge University Press, Cambridge, pp 287–316

  27. McGraw KO, Wong SP (1996) Forming inferences about some intraclass correlation coefficients. Psychol Methods 1:30–46

  28. Mumm J, Mutlu B (2011) Human–robot proxemics. In: Proceedings of the 6th international conference on human–robot interaction, HRI. ACM Press, New York, USA. https://doi.org/10.1145/1957656.1957786

  29. Mutlu B, Shiwa T, Kanda T, Ishiguro H, Hagita N (2009) Footing in human-robot conversations. In: Proceedings of the 4th ACM/IEEE international conference on human–robot interaction. ACM Press, New York, p 61. https://doi.org/10.1145/1514095.1514109

  30. Mutlu B, Yamaoka F, Kanda T, Ishiguro H, Hagita N (2009) Nonverbal leakage in robots. In: Proceedings of the 4th ACM/IEEE international conference on human–robot interaction. ACM Press, New York, p 69. https://doi.org/10.1145/1514095.1514110

  31. Nass C, Moon Y (2000) Machines and mindlessness: social responses to computers. J Soc Issues 56:81–103

  32. Nomura T, Suzuki T, Kanda T et al (2006) Measurement of negative attitudes toward robots. Interact Stud 7(3):437–454. https://doi.org/10.1075/is.7.3.14nom

  33. Nomura T, Suzuki T, Kanda T et al. (2007) Measurement of anxiety toward robots. In: Proceedings of the 16th IEEE international conference on robot and human interactive communication. IEEE Press, Piscataway, NJ, pp 372–377

  34. Pereira A, Leite I, Mascarenhas S, Martinho C, Paiva A (2011) Using empathy to improve human-robot relationships. In: Akan O, Bellavista P, Cao J, Dressler F, Ferrari D, Gerla M, Kobayashi H, Pallazo S, Sahni S, Shen X, Stan M, Xiaohua J, Zomaya A, Coulson G, Lamers MH, Verbeek FJ (eds) Human-robot personal relationships. Lecture notes of the institute for computer sciences, social informatics and telecommunications engineering. Springer, Berlin, pp 130–138. https://doi.org/10.1007/978-3-642-19385-9_17

  35. Press C (2011) Action observation and robotic agents: learning and anthropomorphism. Neurosci Biobehav R 35:1410–1418

  36. Rosenthal-von der Pütten AM, Krämer NC, Hoffmann L, Sobieraj S, Eimler SC (2013) An experimental study on emotional reactions towards a robot. Int J Soc Robot 5:17–34

  37. Rosenthal-von der Pütten AM, Schulte FP, Eimler SC, Sobieraj S, Hoffmann L, Maderwald S, Brand M, Krämer NC (2014) Investigations on empathy towards humans and robots using fMRI. Comput Hum Behav 33:201–212. https://doi.org/10.1016/j.chb.2014.01.004

  38. Salem M, Eyssel F, Rohlfing K, Kopp S, Joublin F (2011) Effects of gesture on the perception of psychological anthropomorphism: a case study with a humanoid robot. In: Hutchison D et al (eds) Social robotics. Lecture notes in computer science. Springer, Berlin, pp 31–41. https://doi.org/10.1007/978-3-642-25504-5_4

  39. Scheutz M, Schermerhorn P, Kramer J (2006) The utility of affect expression in natural language interactions in joint human–robot tasks. In: Proceedings of the 1st ACM SIGCHI/SIGART conference on human robot interaction, HRI, https://doi.org/10.1145/1121241.1121281

  40. Suzuki Y, Galli L, Ikeda A et al (2015) Measuring empathy for human and robot hand pain using electroencephalography. Sci Rep 5:15924

  41. Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. In: The 21st IEEE international symposium on robot and human interactive communication, RO-MAN, pp 314–321. https://doi.org/10.1109/ROMAN.2012.6343772

  42. Tsai J, Bowring E, Marsella S, Wood W, Tambe M (2012) A study of emotional contagion with virtual characters. In: Proceedings of the 12th international workshop on intelligent virtual agents 7502, IVA. Springer, Berlin, pp 81–88. https://doi.org/10.1007/978-3-642-33197-8_8

  43. Valdez P, Mehrabian A (1994) Effects of color on emotions. J Exp Psychol Gen 123:394–409

  44. von der Pütten AM, Klatt J, Hoffmann L, Krämer NC (2011) Quid pro quo? Reciprocal self-disclosure and communicative accommodation towards a virtual interviewer. In: Lecture notes in computer science 6895. Springer, Berlin, pp 183–194

  45. von der Pütten AM, Krämer NC, Gratch J, Kang S-H (2010) It doesn’t matter what you are! Explaining social effects of agents and avatars. Comput Hum Behav 26:1641–1650

  46. Wallbott HG (1988) In and out of context: influences of facial expression and context information on emotion attributions. Brit J Soc Psychol 27:357–369

  47. Watson D, Tellegen A, Clark LA (1988) Development and validation of brief measure of positive and negative affect: the PANAS scales. J Pers Soc Psychol 54:1063–1070

  48. Wu Y, Babu SV, Armstrong R, Bertrand JW, Luo J, Roy T, Daily SB, Dukes LC, Hodges LF, Fasolino T (2014) Effects of virtual human animation on emotion contagion in simulated inter-personal experiences. IEEE T Vis Comput Gr 20:626–635

  49. Xu J, Broekens J, Hindriks K, Neerincx MA (2014) Robot mood is contagious: effects of robot body language in the imitation game. In: Proceedings of the 2014 international conference on autonomous agents and multi-agent systems. International foundation for autonomous agents and multiagent systems, Paris, France, pp 973–980

  50. Zatsiorsky VM, Prilutsky BI (2012) Biomechanics of skeletal muscles. Human Kinetics, Champaign

  51. Zecca M, Endo N, Momoki S, Itoh K, Takanishi A (2008) Design of the humanoid robot KOBIAN: preliminary analysis of facial and whole body emotion expression capabilities. In: 2008 8th IEEE-RAS international conference on humanoid robots, Humanoids, pp 487–492. https://doi.org/10.1109/ICHR.2008.4755969

Author information

Corresponding author

Correspondence to Astrid M. Rosenthal-von der Pütten.

About this article

Cite this article

Rosenthal-von der Pütten, A.M., Krämer, N.C. & Herrmann, J. The Effects of Humanlike and Robot-Specific Affective Nonverbal Behavior on Perception, Emotion, and Behavior. Int J of Soc Robotics 10, 569–582 (2018). https://doi.org/10.1007/s12369-018-0466-7
