
Convolutional Neural Network Applied to the Gesticulation Control of an Interactive Social Robot with Humanoid Aspect

  • Conference paper
  • In: Intelligent Systems and Applications (IntelliSys 2019)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 1038)

Abstract

This paper presents the implementation of a facial gesture recognition system based on a convolutional neural network algorithm for controlling the gesticulation of an interactive social robot with a humanoid appearance, designed to fulfill the proposed objectives. In addition, the robot incorporates an auditory communication system for human-robot interaction based on visemes, coordinating the robot's mouth movements with the processed audio of text converted into the robot's voice (text to speech). The convolutional neural network embedded in the social-interactive robot achieves an accuracy of 61%, while the synchronization offset between the robot's mouth movement and its voice audio is within 0.1 s. The aim is to endow social robots with mechanisms for more natural interaction with people, thereby facilitating their application in children's teaching and learning, medical therapies, and entertainment.
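The facial gesture recognition pipeline mentioned in the abstract can be illustrated with a minimal forward pass of a convolutional network. This is a hedged sketch, not the authors' implementation: the 48×48 grayscale input, the 5×5 kernels, the single convolution/pooling stage, and the seven expression classes are all illustrative assumptions, with random weights standing in for trained ones.

```python
import numpy as np

def conv2d(x, kernels):
    """Valid 2D cross-correlation: x (H, W), kernels (K, kh, kw) -> (K, H-kh+1, W-kw+1)."""
    K, kh, kw = kernels.shape
    H, W = x.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kernels[k])
    return out

def max_pool(x, s=2):
    """Non-overlapping s x s max pooling over each feature map."""
    K, H, W = x.shape
    return x[:, :H - H % s, :W - W % s].reshape(K, H // s, s, W // s, s).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
img = rng.random((48, 48))                      # 48x48 grayscale face crop (assumed input size)
kernels = rng.standard_normal((8, 5, 5)) * 0.1  # 8 random 5x5 filters (untrained placeholders)
feat = np.maximum(conv2d(img, kernels), 0.0)    # convolution + ReLU -> (8, 44, 44)
pooled = max_pool(feat)                         # 2x2 max pooling -> (8, 22, 22)
flat = pooled.reshape(-1)                       # flatten for the dense layer
W_fc = rng.standard_normal((7, flat.size)) * 0.01
probs = softmax(W_fc @ flat)                    # probabilities over 7 assumed expression classes
```

In a trained system the kernel and dense-layer weights would come from supervised learning on labeled face crops; here they only demonstrate the data flow and tensor shapes.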
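The mouth-audio synchronization can likewise be sketched as a viseme schedule derived from phoneme timings of the synthesized speech. This is not the paper's method; the phoneme-to-viseme table, the phoneme symbols, and the durations below are purely illustrative assumptions.

```python
# Hypothetical mapping from phonemes to mouth poses (visemes); real systems
# use language-specific tables provided by the TTS engine.
PHONEME_TO_VISEME = {
    "a": "open", "e": "mid", "i": "spread", "o": "round", "u": "round",
    "m": "closed", "b": "closed", "p": "closed", "s": "teeth",
}

def viseme_schedule(phonemes, durations):
    """Return (start_time_s, viseme) pairs so mouth poses track the TTS audio."""
    t, schedule = 0.0, []
    for ph, d in zip(phonemes, durations):
        schedule.append((round(t, 3), PHONEME_TO_VISEME.get(ph, "rest")))
        t += d
    return schedule

# Example: the word "ola" with assumed per-phoneme durations in seconds.
sched = viseme_schedule(["o", "l", "a"], [0.12, 0.08, 0.15])
```

Commanding the mouth servo at each scheduled start time keeps the pose changes aligned with the audio; the sub-0.1 s offset reported in the abstract would then be bounded by the timing resolution of the servo controller.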





Acknowledgment

The authors thank the Technical University of Ambato and the "Dirección de Investigación y Desarrollo" (DIDE) for their support in carrying out this research within the project "Plataforma Móvil Omnidireccional KUKA dotada de Inteligencia Artificial utilizando estrategias de Machine Learnig para Navegación Segura en Espacios no Controlados" (project code PFISEI27).

Author information

Correspondence to Marcelo V. Garcia.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Arias, E., Encalada, P., Tigre, F., Granizo, C., Gordon, C., Garcia, M.V. (2020). Convolutional Neural Network Applied to the Gesticulation Control of an Interactive Social Robot with Humanoid Aspect. In: Bi, Y., Bhatia, R., Kapoor, S. (eds) Intelligent Systems and Applications. IntelliSys 2019. Advances in Intelligent Systems and Computing, vol 1038. Springer, Cham. https://doi.org/10.1007/978-3-030-29513-4_76

