Development of a Socially Interactive System with Whole-Body Movements for BHR-4

Abstract

Humans have long communicated with one another through voice, facial expressions, and body movements. A humanoid robot that offers a natural, human-like mode of interaction is therefore more readily accepted by people. To date, however, most humanoid robots have had difficulty interacting with humans in a human-like way. This study addresses that issue by developing a socially interactive system that enhances the natural communication ability of a humanoid robot. The system, implemented on the android robot BHR-4, provides hearing, voice conversation, and facial and bodily emotional expression. A full-body social motion planner is then presented, whose objective is to control the whole-body motion of the robot in a manner similar to that of humans. Finally, human-robot interaction experiments are conducted with the robot in an indoor environment. The results show that combining verbal behavior with both facial expressions and body movements is rated better than verbal behavior alone, verbal behavior with facial expressions only, or verbal behavior with body movements only.
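
The system described above couples voice conversation with synchronized facial and whole-body emotional expression. As a rough illustration of that coordination idea only, the Python sketch below pairs each dialogue response with a facial expression and a gesture and dispatches the three channels together. It is not the authors' implementation: the class names, the keyword-based dialogue policy, and the gesture labels are hypothetical stand-ins for BHR-4's actual speech, facial-actuation, and motion-planning modules.

```python
# Minimal sketch of the coordination idea described in the abstract: each
# dialogue turn is answered with speech, a facial expression, and a whole-body
# gesture dispatched together, rather than with speech alone. This is not the
# authors' implementation; all names and the keyword policy are hypothetical.

from dataclasses import dataclass
from enum import Enum, auto


class Emotion(Enum):
    NEUTRAL = auto()
    HAPPY = auto()
    SAD = auto()


@dataclass
class Behavior:
    """One synchronized multimodal action: what to say, show, and do."""
    utterance: str
    expression: Emotion
    gesture: str  # name of a pre-planned whole-body motion clip


# Toy dialogue policy mapping a recognized keyword to a full behavior.
# A real system would use a dialogue manager and an emotion model instead.
POLICY = {
    "hello": Behavior("Hello, nice to meet you!", Emotion.HAPPY, "wave"),
    "goodbye": Behavior("Goodbye, see you soon.", Emotion.SAD, "bow"),
}


def respond(recognized_text: str) -> Behavior:
    """Select a behavior for the recognized utterance, defaulting to neutral."""
    for keyword, behavior in POLICY.items():
        if keyword in recognized_text.lower():
            return behavior
    return Behavior("I see.", Emotion.NEUTRAL, "idle")


def execute(behavior: Behavior) -> None:
    """Dispatch all three channels together; here they are only printed.

    On a robot these would be non-blocking commands to the text-to-speech
    engine, the facial actuators, and the whole-body motion planner, issued
    at the same time so the channels stay synchronized.
    """
    print(f"say:  {behavior.utterance}")
    print(f"face: {behavior.expression.name}")
    print(f"body: {behavior.gesture}")


if __name__ == "__main__":
    execute(respond("Hello there, robot"))
```

The relevant design choice is that the face and body commands are bound to the utterance as a single behavior rather than layered on afterwards, reflecting the paper's finding that combined verbal and nonverbal channels outperform speech alone.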

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China under Grants 61320106012, 61375103, 61533004, 61273348, 61175077, and 61321002; in part by the 863 Program of China under Grants 2014AA041602, 2015AA042305, and 2015AA043202; in part by the Key Technologies Research and Development Program under Grants 2015BAF13B01 and 2015BAK35B01; in part by the Beijing Natural Science Foundation under Grant 4154084; and in part by the "111" Project under Grant B08043.

Author information

Corresponding author

Correspondence to Junyao Gao.

About this article

Cite this article

Ma, G., Gao, J., Yu, Z. et al. Development of a Socially Interactive System with Whole-Body Movements for BHR-4. Int J of Soc Robotics 8, 183–192 (2016). https://doi.org/10.1007/s12369-015-0330-y
