ABSTRACT
Sparkybot is a novel robotic platform designed to enrich child-robot interaction by leveraging embodied AI agent technology. The platform allows children to customize the robot's characters and interaction behaviors, providing a more personalized and engaging experience. We demonstrate two scenarios that showcase how children can customize and interact with the robot. Our goal is to provide a platform that enables children to interact with robots in a more creative and engaging way.
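The customization described above can be illustrated with a minimal sketch. The abstract does not specify Sparkybot's implementation, so everything here is an assumption: a hypothetical `RobotCharacter` profile that a child might fill in, rendered into a system prompt for an LLM-backed embodied agent.

```python
from dataclasses import dataclass, field

@dataclass
class RobotCharacter:
    """A child-defined character profile (hypothetical schema, not Sparkybot's actual API)."""
    name: str
    personality: str
    voice_style: str
    behaviors: list[str] = field(default_factory=list)

    def to_system_prompt(self) -> str:
        """Render the profile as a system prompt for the agent's language model."""
        lines = [
            f"You are {self.name}, a friendly robot companion for children.",
            f"Personality: {self.personality}.",
            f"Speak in a {self.voice_style} style.",
        ]
        if self.behaviors:
            lines.append("You can perform these behaviors: " + ", ".join(self.behaviors) + ".")
        return "\n".join(lines)

# Example: a character a child might configure
sparky = RobotCharacter(
    name="Sparky",
    personality="curious and playful",
    voice_style="cheerful",
    behaviors=["wave", "spin", "tell a story"],
)
print(sparky.to_system_prompt())
```

In a design like this, the child edits only the profile fields; the prompt template keeps the agent's tone and safety framing fixed while the character varies.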
Index Terms
- Sparkybot: An Embodied AI Agent-Powered Robot with Customizable Characters and Interaction Behavior for Children