
Analysis of impressions of robot by changing its motion and trajectory parameters for designing parameterized behaviors of home-service robots

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

In this study, we analyzed human impressions of the motions and trajectories of a robot, with the aim of designing appropriate motions for home-service robots such as cleaning robots. General communication robots have various expression modalities, such as facial expressions and gestures. These modalities mirror those of humans and are therefore easy for humans to understand; however, they cannot be incorporated into cleaning robots because of the robots' shape limitations. We therefore employed the motions and trajectories of a robot as its expression modalities and analyzed how differences in the robot's motions changed human impressions. Based on prior research, we focused on four parameters: difference in speed (acceleration/deceleration), difference in direction (curvature), distance between the robot and a target (a participant or an obstacle), and duration of the pausing action. To analyze the effects of these motion differences on human impressions, we conducted experiments based on the semantic differential method. Analysis of the results yielded four factors: ability, comfort, activity, and anthropomorphism. After calculating the factor scores, we conducted an analysis of variance for each motion condition. The results revealed three tendencies in how impressions changed with the robot's motion: effects of the robot's acceleration/deceleration, of the distance between the robot's trajectory and the target, and of the participants' gender.
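The four motion parameters above (acceleration/deceleration, curvature, pause duration, and target distance) can be sketched as a simple parameterized trajectory generator. This is an illustrative sketch only: the function name, the differential-drive kinematic model, and all parameter values are assumptions for demonstration, not the implementation used in the paper.

```python
import math

def motion_profile(v_max, accel, pause_s, curvature, total_s, dt=0.05):
    """Sample a hypothetical robot trajectory parameterized as in the study.

    v_max     : cruising speed (m/s)
    accel     : acceleration used to ramp speed up (m/s^2)
    pause_s   : (start, end) of a pausing action, in seconds
    curvature : constant path curvature (1/m); turn rate = curvature * speed
    total_s   : total duration to simulate (s)
    Returns a list of (t, x, y, v) samples.
    """
    x = y = heading = v = 0.0
    t = 0.0
    samples = []
    while t < total_s:
        if pause_s[0] <= t < pause_s[1]:
            v = 0.0                          # pausing action: robot stops in place
        else:
            v = min(v_max, v + accel * dt)   # ramp toward v_max at the chosen acceleration
        heading += curvature * v * dt        # curved path: heading changes with arc length
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        samples.append((t, x, y, v))
        t += dt
    return samples
```

Varying one parameter at a time over such a generator (e.g., sharper acceleration, tighter curvature, a longer pause) is one way to produce the controlled motion conditions that a semantic-differential experiment like the one described here would compare.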





Acknowledgements

This study was supported by Samsung Research, Samsung Electronics Co., Ltd.

Author information

Correspondence to Joo-Ho Lee.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yamazoe, H., Chun, J., Kim, Y. et al. Analysis of impressions of robot by changing its motion and trajectory parameters for designing parameterized behaviors of home-service robots. Intel Serv Robotics 16, 3–18 (2023). https://doi.org/10.1007/s11370-022-00447-1
