
Guidelines for Contextual Motion Design of a Humanoid Robot

International Journal of Social Robotics

Abstract

The motion of a humanoid robot is one of the most intuitive communication channels for human-robot interaction. Previous studies have presented knowledge for generating speech-based motions of virtual agents on screens. However, physical humanoid robots share time and space with people, so numerous speechless situations arise in which the robot cannot be hidden from users. We therefore need to understand the appropriate roles of motion design for a humanoid robot across many different situations. We distilled this knowledge into motion-design guidelines based on iterative findings from design case studies and a literature review. The guidelines are divided into two main roles, for speech-based and speechless situations; the latter is further subdivided into idling, observing, listening, expecting, and mood-setting, which are distributed across different levels of intention. A series of experiments showed that our guidelines help create preferable motion designs for a humanoid robot. This study offers researchers a balanced perspective between speech-based and speechless situations, so that they can design the motions of a humanoid robot to satisfy users in more acceptable and pleasurable ways.
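
To make the taxonomy concrete, the guidelines can be pictured as a small context model from which a motion controller chooses a behavior family and an intention level. The Python sketch below is a minimal illustration only: the names (MotionContext, INTENTION_LEVEL, select_context), the selection cues, and the numeric levels are our own assumptions, not an implementation from the paper, which specifies the roles and their ordering rather than any code.

    from enum import Enum, auto

    class MotionContext(Enum):
        """Contextual roles from the guidelines: five speechless roles
        ordered by increasing level of intention, plus the speech-based
        role (comments paraphrase the abstract's categories)."""
        IDLING = auto()        # no one engaged: subtle "alive" micro-motions
        OBSERVING = auto()     # a person is nearby: slow gaze and head shifts
        LISTENING = auto()     # a person is speaking: nods, sustained gaze
        EXPECTING = auto()     # awaiting a response: forward lean, fixed gaze
        MOOD_SETTING = auto()  # shaping the atmosphere: expressive gestures
        SPEAKING = auto()      # speech-based role: co-speech gesture

    # Illustrative intention levels only; the paper orders the speechless
    # roles by intention but does not prescribe numeric values.
    INTENTION_LEVEL = {
        MotionContext.IDLING: 0.1,
        MotionContext.OBSERVING: 0.3,
        MotionContext.LISTENING: 0.5,
        MotionContext.EXPECTING: 0.7,
        MotionContext.MOOD_SETTING: 0.9,
        MotionContext.SPEAKING: 1.0,
    }

    def select_context(robot_speaking: bool, user_speaking: bool,
                       awaiting_reply: bool, user_present: bool,
                       staging_event: bool = False) -> MotionContext:
        """Map coarse interaction cues to a motion context (hypothetical logic)."""
        if robot_speaking:
            return MotionContext.SPEAKING
        if user_speaking:
            return MotionContext.LISTENING
        if awaiting_reply:
            return MotionContext.EXPECTING
        if staging_event:
            return MotionContext.MOOD_SETTING
        if user_present:
            return MotionContext.OBSERVING
        return MotionContext.IDLING

    # Example: a person is nearby but no conversation is under way.
    ctx = select_context(robot_speaking=False, user_speaking=False,
                         awaiting_reply=False, user_present=True)
    print(ctx, INTENTION_LEVEL[ctx])  # MotionContext.OBSERVING 0.3

A real controller would of course derive these cues from perception (speech detection, person tracking) and blend transitions between contexts; the sketch only shows how the guideline categories partition the interaction space.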



Acknowledgements

We thank Fumitaka Yamaoka, Takamasa Iio, Yusuke Okuno, and Shiga Miwa for their extensive help in making the scenario movies, and we also appreciate the technical assistance of Shunsuke Yoshida and Kazuhiko Shinozawa of ATR’s IRC Lab.

Corresponding author

Correspondence to Jinyung Jung.

Additional information

This work was supported by the Internship Program of Brain Korea 21 and in part by a Grant-in-Aid for Scientific Research in Japan, KAKENHI (21118008).



Cite this article

Jung, J., Kanda, T. & Kim, M.S. Guidelines for Contextual Motion Design of a Humanoid Robot. Int J of Soc Robotics 5, 153–169 (2013). https://doi.org/10.1007/s12369-012-0175-6
