Abstract
Basic turn-taking and imitation skills are imperative for effective communication and social interaction (Nehaniv in Imitation and Social Learning in Robots, Springer, New York, 2007). Recently, research has demonstrated that interactive games using turn-taking and imitation yield positive results with autistic children who have impaired communication or social skills (Barakova and Brok in Proceedings of the 9th International Conference on Entertainment Computing, pp. 115–126, 2010). This paper describes a robot that plays interactive imitation games using hand and face tracking. The robot is equipped with a head and two arms, each with two degrees of freedom, and a camera. We trained a human hand detector and subsequently used this detector, along with a standard face tracker, to create two autonomous interactive games: single-player (“Imitate Me, Imitate You”) and two-player (“Pass the Pose”). Additionally, we implemented a third setting in which the robot is teleoperated by remote control. In “Imitate Me, Imitate You”, the robot has both passive and active game modes. In the passive mode, the robot waits for the child to initiate an interaction by raising one or both hands. In the active mode, the robot initiates interactions. The “Pass the Pose” game engages two children in cooperative play by enlisting the robot as a mediator between the two children as they alternate initiating and imitating poses. These games are designed to increase attention, promote turn-taking skills and encourage child-led verbal and non-verbal communication through simple imitative play. This research makes two specific contributions: (1) we present a low-cost robot design which measures and adapts to a child’s actions during interactive games, and (2) we train, test and make freely available a new hand detector, based on Haar-like features, which is usable in various kinds of human-robot interactions.
We present proof-of-concept experiments with a group of typically developing children.
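The hand detector described above is built on Haar-like features evaluated over an integral image, the same building blocks used in the boosted cascade framework of Viola and Jones (2001) cited below. As a minimal illustrative sketch (not the authors' actual code; all function names here are hypothetical), a two-rectangle "edge" feature can be computed in a few lines:

```python
# Sketch of a two-rectangle Haar-like feature, the basic unit of the
# boosted cascade detectors this paper's hand detector builds on.
# An integral image lets any rectangle sum be read in constant time.

def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w-by-h rectangle with top-left corner (x, y)."""
    a = ii[y + h - 1][x + w - 1]
    b = ii[y - 1][x + w - 1] if y > 0 else 0
    c = ii[y + h - 1][x - 1] if x > 0 else 0
    d = ii[y - 1][x - 1] if x > 0 and y > 0 else 0
    return a - b - c + d

def two_rect_feature(ii, x, y, w, h):
    """Edge-type Haar feature: left half minus right half of a w-by-h window."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

In a full detector such as the one released with this paper, thousands of such features at varying positions and scales are selected and weighted by boosting (Schapire 2003) and arranged into a rejection cascade for real-time detection.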
References
Nehaniv C (2007) Synchrony and turn-taking as communicative mechanisms. In: Dautenhahn K, Nehaniv C (eds) Imitation and social learning in robots, humans and animals. Springer, Berlin
Barakova EI, Brok JCJ (2010) Engaging autistic children in imitation and turn-taking games with multiagent system of interactive lighting blocks. In: Proceedings of the 9th international conference on entertainment computing, pp 115–126
Dautenhahn K (1999) Robots as social actors: Aurora and the case of autism. In: Proc cognitive technology conference, pp 359–374
Welch KC, Lahiri U, Warren Z, Sarkar N (2010) An approach to the design of socially acceptable robots for children with autism spectrum disorders. Int J Soc Robot 391–403
Goldstein H, Thiemann KS (2001) Social stories, written text cues, and video feedback: effects on social communication of children with autism. J Appl Behav Anal 34:425–446
Dautenhahn K (2000) Design issues on interactive environments for children with autism. In: Proceedings international conference on disability, virtual reality and associated technologies, pp 153–161
Boccanfuso L, O’Kane JM (2010) Adaptive robot design with hand and face tracking for use in autism therapy. In: Proceedings of the second international conference on social robotics, ICSR’10. Springer, Berlin, Heidelberg, pp 265–274
(2009) Prevalence of autism spectrum disorders, autism and developmental disabilities monitoring network, United States, 2006. Surveill Summ 58(10)
Shimabukuro TT, Grosse SD, Rice C (2008) Medical expenditures for children with an autism spectrum disorder in a privately insured population. J Autism Dev Disord 38(3):546–552
Caprino F, Besio S, Laudanna E (2010) Using robots in education and therapy sessions for children with disabilities: guidelines for teachers and rehabilitation professionals. In: Computers helping people with special needs, vol 6179, pp 511–518
Marti P, Pollini A, Rullo A, Shibata T (2005) Engaging with artificial pets. In: Proc conference on European association of cognitive ergonomics, pp 99–106
Lusher D, Castiello U, Pierno AC, Maria M (2008) Robotic movement elicits visuomotor priming in children with autism. Neuropsychologia 46:448–454
Duquette A, Mercier H, Michaud F (2006) Investigating the use of a mobile robotic toy as an imitation agent for children with autism. In: International conference on epigenetic robotics
Dautenhahn K, Werry I (2000) Issues of robot-human interaction dynamics in the rehabilitation of children with autism. In: Proc international conference on the simulation of adaptive behavior, pp 519–528
Robins B, Dautenhahn K, Dickerson P (2009) From isolation to communication: a case study evaluation of robot assisted play for children with autism with a minimally expressive humanoid robot. In: Proc international conference on advances in computer-human interactions
Kozima H, Nakagawa C, Yasuda Y (2007) Children-robot interaction: a pilot study in autism therapy. Prog Brain Res 164:385–400
Kozima H, Michalowski M, Nakagawa C (2009) Keepon. Int J Soc Robot 1:3–18
Tapus A, Tapus C, Matarić MJ (2008) User-robot personality matching and assistive robot behavior adaptation for post-stroke rehabilitation therapy. Intell Serv Rob J (April):169–183
Patrizia M, Claudio M, Leonardo G, Alessandro P (2009) A robotic toy for children with special needs: from requirements to design. In: Proc IEEE international conference on rehabilitation robotics, pp 918–923
Ferrari E, Robins B, Dautenhahn K (2009) Therapeutic and educational objectives in robot assisted play for children with autism. In: Proc IEEE international symposium on robot and human interactive communication
Robins B, Dautenhahn K, Boekhorst R, Billard A (2004) Robots as assistive technology—does appearance matter. In: Proc IEEE international workshop on robot and human interactive communication, pp 277–282
Harrington K, Fu Q, Lu W, Fischer G, Su H, Dickstein-Fischer H (2010) Cable-driven elastic parallel humanoid head with face tracking for autism spectrum disorder interventions. In: Proceedings of IEEE engineering in biology and medicine conference, Buenos Aires, Argentina
Prigent A, Estraillier P, DaSilva MP, Courboulay V (2009) Fast, low resource, head detection and tracking for interactive applications. Psychol J 7:243–264
Bradski G (2000) The OpenCV library. Dr. Dobb’s J Softw Tools (November):120–126
Lienhart R, Maydt J (2002) An extended set of Haar-like features for rapid object detection. In: Proc IEEE international conference on image processing, pp 900–903
Schapire RE (2003) The boosting approach to machine learning: An overview. In: Denison DD, Hansen MH, Holmes C, Mallick B, Yu B (eds) Nonlinear estimation and classification. Springer, Berlin
Viola P, Jones M (2001) Rapid object detection using a boosted cascade of simple features. In: Proc IEEE computer society conference on computer vision and pattern recognition, vol 1, pp 511–518
Viola P, Jones M (2004) Robust real-time face detection. Int J Comput Vis 57(2):137–154
Dadgostar F, Barczak ALC (2005) Real-time hand tracking using a set of co-operative classifiers based on Haar-like features. Res Lett Inf Math Sci 7:29–42
Huttenlocher D, Zisserman A, Buehler P, Everingham M (2008) Long term arm and hand tracking for continuous sign language tv broadcasts. In: British machine vision conference
Wachs J (2011) Enhanced human computer interface through webcam image processing library—aGest.xml, March 2011
Bradski GR (1998) Computer vision face tracking for use in a perceptual user interface. Interface 2(2):12–21
Cite this article
Boccanfuso, L., O’Kane, J.M. CHARLIE: An Adaptive Robot Design with Hand and Face Tracking for Use in Autism Therapy. Int J of Soc Robotics 3, 337–347 (2011). https://doi.org/10.1007/s12369-011-0110-2