CHARLIE: An Adaptive Robot Design with Hand and Face Tracking for Use in Autism Therapy

Abstract

Basic turn-taking and imitation skills are imperative for effective communication and social interaction (Nehaniv in Imitation and Social Learning in Robots, Springer, New York, 2007). Recent research has demonstrated that interactive games built on turn-taking and imitation yield positive results with autistic children who have impaired communication or social skills (Barakova and Brok in Proceedings of the 9th International Conference on Entertainment Computing, pp. 115–126, 2010). This paper describes a robot that plays interactive imitation games using hand and face tracking. The robot is equipped with a head and two arms, each with two degrees of freedom, and a camera. We trained a human hand detector and subsequently used it, together with a standard face tracker, to create two autonomous interactive games: a single-player game (“Imitate Me, Imitate You”) and a two-player game (“Pass the Pose”). Additionally, we implemented a third setting in which the robot is teleoperated by remote control. In “Imitate Me, Imitate You”, the robot has both passive and active game modes. In the passive mode, the robot waits for the child to initiate an interaction by raising one or both hands; in the active mode, the robot initiates interactions. The “Pass the Pose” game engages two children in cooperative play, with the robot acting as a mediator while the children alternately initiate and imitate poses. These games are designed to increase attention, promote turn-taking skills, and encourage child-led verbal and non-verbal communication through simple imitative play. This research makes two specific contributions: (1) we present a low-cost robot design that measures and adapts to a child's actions during interactive games, and (2) we train, test, and make freely available a new hand detector, based on Haar-like features, which is usable in various kinds of human-robot interaction. We present proof-of-concept experiments with a group of typically developing children.
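
The abstract describes a trained Haar-feature hand detector used alongside a standard face tracker to drive the imitation games. The sketch below is a minimal illustration, not the authors' implementation: it assumes OpenCV's Python bindings, loads the stock frontal-face cascade, stands in a hypothetical hand_cascade.xml for the freely available hand detector described in the paper, and uses an invented raised-hand heuristic to trigger the passive game mode.

import cv2

# Stock frontal-face cascade shipped with OpenCV's Python package; the hand
# cascade path is a placeholder for the trained detector the paper makes
# freely available (the actual file name is not given here).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
hand_cascade = cv2.CascadeClassifier("hand_cascade.xml")  # hypothetical path

cap = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        hands = hand_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        # Assumed passive-mode heuristic: a hand whose bounding box lies above
        # the midline of a detected face counts as a raised hand, the child's
        # cue for the robot to begin imitating.
        raised = [(hx, hy, hw, hh)
                  for (hx, hy, hw, hh) in hands
                  for (fx, fy, fw, fh) in faces
                  if hy + hh < fy + fh // 2]
        if raised:
            # Placeholder for commanding the robot's two-DOF arms to mirror the pose.
            print("Raised hand detected: robot imitates the child's pose")

        cv2.imshow("tracking sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    cap.release()
    cv2.destroyAllWindows()

Boosted cascades of Haar-like features (refs 25, 27, 28 below) keep per-frame detection cheap, which fits the paper's low-cost design goal; the same detectMultiScale call pattern serves both the face and hand cascades.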

References

  1. Nehaniv C (2007) Synchrony and turn-taking as communicative mechanisms. In: Dautenhahn K, Nehaniv C (eds) Imitation and social learning in robots, humans and animals. Springer, Berlin

  2. Barakova EI, Brok JCJ (2010) Engaging autistic children in imitation and turn-taking games with multiagent system of interactive lighting blocks. In: Proceedings of the 9th international conference on entertainment computing, pp 115–126

  3. Dautenhahn K (1999) Robots as social actors: Aurora and the case of autism. In: Proc cognitive technology conference, pp 359–374

  4. Welch KC, Lahiri U, Warren Z, Sarkar N (2010) An approach to the design of socially acceptable robots for children with autism spectrum disorders. Int J Soc Robot 391–403

  5. Goldstein H, Thiemann KS (2001) Social stories, written text cues, and video feedback: effects on social communication of children with autism. J Appl Behav Anal 34:425–446

  6. Dautenhahn K (2000) Design issues on interactive environments for children with autism. In: Proceedings international conference on disability, virtual reality and associated technologies, pp 153–161

  7. Boccanfuso L, O’Kane JM (2010) Adaptive robot design with hand and face tracking for use in autism therapy. In: Proceedings of the second international conference on social robotics, ICSR’10. Springer, Berlin, Heidelberg, pp 265–274

  8. (2009) Prevalence of autism spectrum disorders, autism and developmental disabilities monitoring network, United States, 2006. Surveill Summ 58(10)

  9. Shimabukuro TT, Grosse SD, Rice C (2008) Medical expenditures for children with an autism spectrum disorder in a privately insured population. J Autism Dev Disord 38(3):546–552

  10. Caprino F, Besio S, Laudanna E (2010) Using robots in education and therapy sessions for children with disabilities: guidelines for teachers and rehabilitation professionals. In: Computers helping people with special needs, vol 6179, pp 511–518

  11. Marti P, Pollini A, Rullo A, Shibata T (2005) Engaging with artificial pets. In: Proc conference on European association of cognitive ergonomics, pp 99–106

  12. Lusher D, Castiello U, Pierno AC, Mari M (2008) Robotic movement elicits visuomotor priming in children with autism. Neuropsychologia 46:448–454

  13. Duquette A, Mercier H, Michaud F (2006) Investigating the use of a mobile robotic toy as an imitation agent for children with autism. In: International conference on epigenetic robotics

  14. Dautenhahn K, Werry I (2000) Issues of robot-human interaction dynamics in the rehabilitation of children with autism. In: Proc international conference on the simulation of adaptive behavior, pp 519–528

  15. Robins B, Dautenhahn K, Dickerson P (2009) From isolation to communication: a case study evaluation of robot assisted play for children with autism with a minimally expressive humanoid robot. In: Proc international conference on advances in computer-human interactions.

  16. Kozima H, Nakagawa C, Yasuda Y (2007) Children-robot interaction: a pilot study in autism therapy. Prog Brain Res 164:385–400

  17. Kozima H, Michalowski M, Nakagawa C (2009) Keepon. Int J Soc Robot 1:3–18

  18. Tapus A, Tapus C, Matarić MJ (2008) User-robot personality matching and assistive robot behavior adaptation for post-stroke rehabilitation therapy. Intell Serv Rob J (April):169–183

  19. Patrizia M, Claudio M, Leonardo G, Alessandro P (2009) A robotic toy for children with special needs: from requirements to design. In: Proc IEEE international conference on rehabilitation robotics, pp 918–923

  20. Ferrari E, Robins B, Dautenhahn K (2009) Therapeutic and educational objectives in robot assisted play for children with autism. In: Proc IEEE international symposium on robot and human interactive communication

  21. Robins B, Dautenhahn K, Boekhorst R, Billard A (2004) Robots as assistive technology—does appearance matter. In: Proc IEEE international workshop on robot and human interactive communication, pp 277–282

  22. Harrington K, Fu Q, Lu W, Fischer G, Su H, Dickstein-Fischer H (2010) Cable-driven elastic parallel humanoid head with face tracking for autism spectrum disorder interventions. In: Proceedings of IEEE engineering in biology and medicine conference, Buenos Aires, Argentina

  23. Prigent A, Estraillier P, DaSilva MP, Courboulay V (2009) Fast, low resource, head detection and tracking for interactive applications. Psychol J 7:243–264

  24. Bradski G (2000) The OpenCV library. Dr. Dobb’s J Softw Tools (November):120–126

  25. Lienhart R, Maydt J (2002) An extended set of Haar-like features for rapid object detection. In: Proc IEEE international conference on image processing, pp 900–903

  26. Schapire RE (2003) The boosting approach to machine learning: An overview. In: Denison DD, Hansen MH, Holmes C, Mallick B, Yu B (eds) Nonlinear estimation and classification. Springer, Berlin

  27. Viola P, Jones M (2001) Rapid object detection using a boosted cascade of simple features. In: Proc IEEE computer society conference on computer vision and pattern recognition, vol 1, pp 511–518

  28. Viola P, Jones M (2004) Robust real-time face detection. Int J Comput Vis 57(2):137–154

  29. Dadgostar F, Barczak ALC (2005) Real-time hand tracking using a set of co-operative classifiers based on Haar-like features. Res Lett Inf Math Sci 7:29–42

  30. Huttenlocher D, Zisserman A, Buehler P, Everingham M (2008) Long term arm and hand tracking for continuous sign language TV broadcasts. In: British machine vision conference

  31. Wachs J (2011) Enhanced human computer interface through webcam image processing library—aGest.xml, March 2011

  32. Bradski GR (1998) Computer vision face tracking for use in a perceptual user interface. Interface 2(2):12–21

Author information

Corresponding author

Correspondence to Laura Boccanfuso.

Cite this article

Boccanfuso, L., O’Kane, J.M. CHARLIE: An Adaptive Robot Design with Hand and Face Tracking for Use in Autism Therapy. Int J of Soc Robotics 3, 337–347 (2011). https://doi.org/10.1007/s12369-011-0110-2
