
Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism

Published in: International Journal of Social Robotics

Abstract

Reciprocal interaction and facial expression are among the most interesting topics in the fields of social and cognitive robotics. Moreover, children with autism show a particular interest toward robots, and facial expression recognition can improve these children’s social interaction abilities in real life. In this research, a robotic platform has been developed for reciprocal interaction consisting of two main phases, namely the Non-structured and Structured interaction modes. In the Non-structured interaction mode, a vision system recognizes the user’s facial expressions through a fuzzy clustering method. The interaction decision-making unit is combined with a fuzzy finite state machine to improve the quality of human–robot interaction by utilizing the results of the facial expression analysis. In the Structured interaction mode, a set of imitation scenarios with eight different posed facial behaviors was designed for the robot; the scenarios start with simple facial expressions and become more complicated as they continue. The same vision system and fuzzy clustering method of the Non-structured interaction mode are used for the automatic evaluation of a participant’s gestures. As a pilot study, the effect and acceptability of the platform were investigated with children with autism between 3 and 7 years old, and a preliminary acceptance rate of approximately 78% was observed under our experimental conditions. Lastly, the automatic assessment of imitation quality was compared with the manual video-coding results; Pearson’s correlation between these equivalent grades was computed as r = 0.89, which shows sufficient agreement between the automatic and manual scores.
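The abstract only names the two computational steps; as a rough, self-contained Python sketch of the ideas described above, the code below shows (a) how a fuzzy c-means style membership function can map a facial-feature vector to a recognized expression, and (b) how automatic and manual imitation scores can be compared with Pearson’s r. The feature vector, the pre-trained cluster centers, and the expression labels are hypothetical placeholders for illustration, not the authors’ actual implementation.

    import numpy as np

    def fcm_memberships(x, centers, m=2.0, eps=1e-9):
        # Fuzzy c-means membership of feature vector x in each cluster.
        # centers: (k, d) array of pre-trained centers, one per expression.
        # m: fuzzifier (m > 1); m = 2 is the common default.
        d = np.linalg.norm(centers - x, axis=1) + eps        # distance to each center
        ratios = (d[:, None] / d[None, :]) ** (2.0 / (m - 1.0))
        return 1.0 / ratios.sum(axis=1)                      # memberships sum to 1

    # Hypothetical cluster centers, assumed learned offline from labeled data.
    labels = ["neutral", "happy", "sad", "surprised"]
    centers = np.array([[0.1, 0.1], [0.9, 0.2], [0.2, 0.8], [0.8, 0.9]])

    features = np.array([0.85, 0.25])      # toy feature vector from the vision system
    u = fcm_memberships(features, centers)
    print(labels[int(np.argmax(u))])       # -> "happy" for this toy input

    # Agreement between automatic and manual imitation scores (Pearson's r),
    # computed here on made-up grades; the paper reports r = 0.89 on its data.
    auto = np.array([4.0, 2.5, 3.0, 5.0, 1.5])
    manual = np.array([4.5, 2.0, 3.5, 4.5, 1.0])
    print(np.corrcoef(auto, manual)[0, 1])

The membership vector, rather than a hard label, is what makes the representation fuzzy: a borderline face can belong 0.55 to "happy" and 0.40 to "surprised", which is the kind of graded input a fuzzy finite state machine, as mentioned in the abstract, can act on.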



Acknowledgements

Our profound gratitude goes to the “Center for the Treatment of Autistic Disorders (CTAD)” and its psychologists for their contributions to the clinical trials with children with autism. This research was funded by the “Cognitive Sciences and Technology Council” (CSTC) of Iran (http://www.cogc.ir/). We also thank the Iranian National Science Foundation (INSF) for its complementary support of the Social & Cognitive Robotics Laboratory (http://en.insf.org/).

Author information


Corresponding author

Correspondence to Ali Meghdari.

Ethics declarations

Funding

This study was funded by the “Cognitive Sciences and Technology Council” (CSTC) of Iran (Grant Number: 95p22).

Conflict of interest

Author Ali Meghdari has received research grants from the “Cognitive Sciences and Technology Council” (CSTC) of Iran. The authors Ali Ghorbandaei Pour, Alireza Taheri, and Minoo Alemi declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Ethical approval for the protocol of this study was provided by Iran University of Medical Sciences (No. IR.IUMS.REC.1395.95301469), and the certification for ABA and robot-assisted therapy with autistic children was received from the Center for the Treatment of Autistic Disorders (CTAD), Iran.


About this article


Cite this article

Ghorbandaei Pour, A., Taheri, A., Alemi, M. et al. Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism. Int J Soc Robot 10, 179–198 (2018). https://doi.org/10.1007/s12369-017-0461-4

