
Realization of sign language motion using a dual-arm/hand humanoid robot

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

Recent advances in technological maturity have empowered robots to assist humans and provide daily services. Voice commands are a popular human–machine interface for such communication. Unfortunately, deaf people cannot exchange information with robots through vocal modalities. To interact with deaf people effectively and intuitively, it is desirable for robots, especially humanoids, to have manual communication skills such as performing sign language. Rather than relying on ad hoc programming to generate each particular sign language motion, we present an imitation system that teaches a humanoid robot to perform sign language by directly replicating observed demonstrations. The system encodes human hand–arm motion captured by low-cost depth sensors as a skeletal motion time series, from which an initial robot movement is generated by means of perception-to-action mapping. To tackle the body correspondence problem, a virtual impedance control approach is adopted to follow the initial movement smoothly while preventing potential risks arising from the physical differences between the human and the robot, such as joint limits and self-collision. In addition, an integrated leg-joint stabilizer improves the balance of the whole robot. Finally, our humanoid robot, NINO, successfully learned by imitation from human demonstration to introduce itself in Taiwanese Sign Language.
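As a concrete illustration of the pipeline described in the abstract, the sketch below maps depth-sensor skeleton points to a single robot joint angle and then filters the mapped trajectory through a virtual mass–spring–damper before it would be sent to the robot. This is a minimal sketch under stated assumptions: the joint limits, impedance gains, and 30 Hz frame rate are illustrative values, not parameters reported in the paper.

    import numpy as np

    # --- Perception-to-action mapping (illustrative) -----------------------
    # A depth sensor such as the Kinect yields 3D skeleton joints per frame.
    # One simple mapping recovers a joint angle from three adjacent skeleton
    # points and assigns it to the corresponding robot joint.

    def elbow_angle(shoulder, elbow, wrist):
        """Elbow flexion angle (radians) from three 3D skeleton points."""
        u = shoulder - elbow
        v = wrist - elbow
        cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.arccos(np.clip(cos_a, -1.0, 1.0))

    # --- Virtual impedance tracking (illustrative) --------------------------
    # Rather than commanding the mapped angles directly, a virtual
    # mass-spring-damper pulls the robot joint toward the human reference,
    # smoothing sensor noise and stopping at the robot's own joint limits
    # (the body correspondence problem). Gains and limits are assumptions.

    Q_MIN, Q_MAX = 0.0, 2.4        # hypothetical joint limits (rad)
    M, B, K = 1.0, 12.0, 40.0      # virtual mass, damping, stiffness

    def impedance_track(q_ref, dt=1.0 / 30.0):
        """Filter a reference trajectory through the virtual impedance."""
        q, dq = q_ref[0], 0.0
        out = []
        for target in q_ref:
            ddq = (K * (target - q) - B * dq) / M   # spring-damper dynamics
            dq += ddq * dt
            q += dq * dt
            if not Q_MIN <= q <= Q_MAX:             # clamp at the joint limit
                q, dq = np.clip(q, Q_MIN, Q_MAX), 0.0
            out.append(q)
        return np.array(out)

    # Example: a noisy 30 Hz human elbow trajectory mapped onto the robot.
    t = np.linspace(0.0, 2.0, 60)
    human = 1.2 + 0.8 * np.sin(2 * np.pi * t) + 0.05 * np.random.randn(t.size)
    robot = impedance_track(human)

In the full system the impedance controller additionally guards against self-collision, and a leg-joint stabilizer maintains whole-body balance; both are omitted here for brevity.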





Author information

Corresponding author: Han-Pang Huang.


About this article

Cite this article

Lo, S.-Y., Huang, H.-P.: Realization of sign language motion using a dual-arm/hand humanoid robot. Intell Serv Robot 9, 333–345 (2016). https://doi.org/10.1007/s11370-016-0203-8

