Abstract
It is important for socially assistive robots to be able to recognize when a user needs and wants help, and to do so in real time so that they can provide timely assistance. We propose an architecture that uses social cues to determine when a robot should provide assistance. Based on a multimodal fusion of eye gaze and language modalities, our architecture is trained and evaluated on data collected in a robot-assisted Lego building task. Because it focuses on social cues, the architecture has minimal dependence on the specifics of a given task, enabling it to be applied in many different contexts. Enabling a social robot to recognize a user's needs through social cues allows it to adapt to user behaviors and preferences, which in turn leads to improved user experiences.
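To make the multimodal fusion idea concrete, the sketch below shows one way a late-fusion classifier over gaze and language features could be structured. It is a minimal, hypothetical illustration only: the module name (HelpNeedFusion), feature dimensions, and fusion strategy are assumptions for the example and are not taken from the paper's architecture.

```python
# Illustrative late-fusion classifier: combines a gaze-feature vector with a
# language embedding to predict whether the user currently needs help.
# All names, dimensions, and the fusion strategy are assumptions for
# illustration, not the architecture described in the paper.
import torch
import torch.nn as nn


class HelpNeedFusion(nn.Module):
    def __init__(self, gaze_dim=8, lang_dim=384, hidden_dim=64):
        super().__init__()
        # Separate encoders for each modality.
        self.gaze_net = nn.Sequential(nn.Linear(gaze_dim, hidden_dim), nn.ReLU())
        self.lang_net = nn.Sequential(nn.Linear(lang_dim, hidden_dim), nn.ReLU())
        # Fused representation -> single logit ("user needs help now").
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, gaze_features, lang_embedding):
        fused = torch.cat(
            [self.gaze_net(gaze_features), self.lang_net(lang_embedding)], dim=-1
        )
        return self.classifier(fused)


# Example: one time step with batch size 1.
model = HelpNeedFusion()
gaze = torch.randn(1, 8)    # e.g., fixation counts or dwell times on task objects
lang = torch.randn(1, 384)  # e.g., a sentence embedding of the user's utterance
needs_help_prob = torch.sigmoid(model(gaze, lang))
```

Late fusion of this kind keeps the two modalities decoupled until the final decision, which is one common way to limit dependence on task-specific features.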
Acknowledgment
We thank Ulyana Kurylo, Alex Reneau, Roxy Wilcox, Kevin Hou, and Anzu Hakone for their contributions.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Wilson, J.R., Aung, P.T., Boucher, I. (2022). When to Help? A Multimodal Architecture for Recognizing When a User Needs Help from a Social Robot. In: Cavallo, F., et al. Social Robotics. ICSR 2022. Lecture Notes in Computer Science(), vol 13817. Springer, Cham. https://doi.org/10.1007/978-3-031-24667-8_23
DOI: https://doi.org/10.1007/978-3-031-24667-8_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-24666-1
Online ISBN: 978-3-031-24667-8