
Correlation Analysis for Predictive Models of Robot User’s Impression: A Study on Visual Medium and Mechanical Noise

Published in: International Journal of Social Robotics

Abstract

Service robots are increasingly common, and studies show that a robot’s appearance and behavior shape users’ impressions of it. Developers therefore need to know whether users’ impressions of a robot match the impression it was designed to convey. Assessing such impressions traditionally requires participants to meet robots in a real-world setting, which is costly and time-consuming, whereas experiments over the internet are now very easy to run. However, there is no evidence that impressions obtained from a video recording or from a simulated avatar are comparable to impressions obtained in a real-world setting. In this study, we hypothesized that there are trade-offs between the ease of collecting impressions and their real-world applicability. Using correlation analysis, we examined whether impressions obtained in “Recorded” and “Avatar” settings can predict impressions in a “Real” setting. We also added a muted real-world “Soundproof” setting to evaluate the influence of the motor noise of robot motion on users’ impressions. In the experiment, two kinds of humanoid robots performed five kinds of motions, and participants rated their impressions quantitatively under the four conditions: real world, video recording, video avatar, and muted real world. Our study thus accounts for the effect of motor noise in addition to the medium through which the robot is seen. Our results show correlations between the “Soundproof”, “Recorded”, and “Avatar” settings; we found that motor noise affects participants’ impressions of the robot and that some trade-off relationships exist between the different conditions.
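
The kind of correlation analysis the abstract describes can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors’ code: the per-motion mean impression ratings are hypothetical placeholders, and the condition names simply mirror the four settings named above. It computes Pearson correlations between ratings in the “Real” setting and each surrogate setting, which is the quantity a predictive model of real-world impressions would build on.

```python
# A minimal sketch of the correlation analysis described in the abstract.
# All numbers are hypothetical placeholders, NOT data from the study.
import numpy as np
from scipy import stats

# Hypothetical mean impression ratings (e.g., on a 7-point scale) for
# 10 robot-motion stimuli, one array per viewing condition.
rng = np.random.default_rng(0)
real = rng.uniform(1, 7, size=10)                     # "Real" setting (reference)
conditions = {
    "Recorded":   real + rng.normal(0, 0.5, size=10), # video recording
    "Avatar":     real + rng.normal(0, 0.8, size=10), # simulated avatar
    "Soundproof": real + rng.normal(0, 0.3, size=10), # muted real-world
}

# Pearson correlation of each surrogate condition with the "Real" setting:
# a high r suggests the cheaper condition could stand in for real-world testing.
for name, ratings in conditions.items():
    r, p = stats.pearsonr(real, ratings)
    print(f"Real vs {name}: r = {r:.2f}, p = {p:.3f}")
```

With real rating data, the same correlations would feed a regression of “Real” ratings on the surrogate ratings, yielding the predictive models referred to in the title.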



Funding

Not applicable.

Author information

Corresponding author

Correspondence to Takamune Izui.

Ethics declarations

Conflict of interest

The authors declare that they have no competing interests.

Availability of Data and Materials

Available.

Consent for Publication

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Izui, T., Venture, G. Correlation Analysis for Predictive Models of Robot User’s Impression: A Study on Visual Medium and Mechanical Noise. Int J of Soc Robotics 12, 425–439 (2020). https://doi.org/10.1007/s12369-019-00601-3
