ABSTRACT
Social play is essential in human interactions: it strengthens social bonding, mitigates stress, and relieves anxiety. With advances in robotics, social robots can fulfill this role in human-robot interaction scenarios for clinical and healthcare purposes. However, robotic intelligence still requires further development to match the wide spectrum of social behaviors and contexts in human interactions. In this paper, we present a robotic intelligence framework with a mutual learning paradigm that applies deep learning-based emotion recognition and behavior perception, through which the robot learns human movements and their contexts via the interactive game of charades. Furthermore, we designed a gesture-based social game to make the social robot more empathetic and engaging for the user. We also created a custom behavior database containing contextual behaviors for the proposed social games. A pilot study with participants aged 12 to 19 was conducted for a preliminary evaluation.
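The behavior-perception loop summarized above can be pictured as matching an observed movement sequence against a behavior database. The following is a minimal, hypothetical sketch of that idea: it reduces a sequence of pose keyframes to a fixed-length feature vector and matches it against stored behavior prototypes by nearest-centroid distance. The labels, feature values, and database structure here are illustrative placeholders, not the paper's actual model or data.

```python
import math

def sequence_features(frames):
    """Average per-frame pose features over time into one fixed-length vector."""
    n = len(frames)
    dims = len(frames[0])
    return [sum(f[d] for f in frames) / n for d in range(dims)]

def classify(frames, behavior_db):
    """Return the behavior label whose stored prototype is nearest (Euclidean)."""
    feats = sequence_features(frames)
    def dist(proto):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(feats, proto)))
    return min(behavior_db, key=lambda label: dist(behavior_db[label]))

# Toy behavior database: label -> prototype feature vector (placeholder values).
behavior_db = {
    "waving": [0.9, 0.1],
    "clapping": [0.1, 0.9],
}

observed = [[0.8, 0.2], [1.0, 0.0]]  # two pose keyframes (placeholder features)
print(classify(observed, behavior_db))  # -> waving
```

In the paper's setting, the placeholder feature extractor would be replaced by a learned model over skeletal keypoints, but the database-lookup structure of the game-play loop is the same.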
"Can You Guess My Moves?": Playing Charades with a Humanoid Robot Employing Mutual Learning with Emotional Intelligence