
When to Help? A Multimodal Architecture for Recognizing When a User Needs Help from a Social Robot

  • Conference paper
  • Published in: Social Robotics (ICSR 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13817)


Abstract

It is important for socially assistive robots to recognize when a user needs and wants help, and to do so in real time so that they can provide timely assistance. We propose an architecture that uses social cues to determine when a robot should provide assistance. Based on a multimodal fusion of eye-gaze and language modalities, our architecture is trained and evaluated on data collected in a robot-assisted Lego building task. Because it relies on social cues rather than task-specific signals, the architecture has minimal dependencies on the specifics of a given task and can be applied in many different contexts. Enabling a social robot to recognize a user's needs through social cues allows it to adapt to user behaviors and preferences, which in turn leads to improved user experiences.
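The multimodal fusion described above could, in its simplest form, combine a per-modality score from gaze and from language into one decision. The following is a minimal late-fusion sketch; all feature names, weights, and the threshold are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical late-fusion sketch for deciding whether a user needs help.
# Feature names, weights, and the 0.5 threshold are illustrative only;
# the paper's actual architecture is not reproduced here.

def gaze_score(features: dict) -> float:
    # Prolonged fixation on the robot or frequent gaze shifts between
    # the task and the robot are treated as possible confusion signals.
    return (0.7 * features.get("fixation_on_robot", 0.0)
            + 0.3 * features.get("gaze_shift_rate", 0.0))

def language_score(features: dict) -> float:
    # An explicit request ("help", "stuck") or hesitation markers
    # each raise the language-based evidence of needing help.
    return max(features.get("help_keyword", 0.0),
               features.get("hesitation", 0.0))

def needs_help(gaze: dict, lang: dict, threshold: float = 0.5) -> bool:
    # Equal-weight late fusion of the two per-modality scores.
    fused = 0.5 * gaze_score(gaze) + 0.5 * language_score(lang)
    return fused >= threshold

# Example: strong gaze fixation plus an explicit help request.
print(needs_help({"fixation_on_robot": 1.0}, {"help_keyword": 1.0}))  # True
```

A learned fusion model (as the paper trains on task data) would replace these hand-set weights, but the decision structure is the same: per-modality evidence combined into a single help/no-help output.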


Notes

  1. https://github.com/mozilla/DeepSpeech.

  2. https://youtu.be/eW2uVBgi9r4.


Acknowledgment

We thank Ulyana Kurylo, Alex Reneau, Roxy Wilcox, Kevin Hou, and Anzu Hakone for their contributions.

Author information

Correspondence to Jason R. Wilson.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Wilson, J.R., Aung, P.T., Boucher, I. (2022). When to Help? A Multimodal Architecture for Recognizing When a User Needs Help from a Social Robot. In: Cavallo, F., et al. (eds.) Social Robotics. ICSR 2022. Lecture Notes in Computer Science, vol. 13817. Springer, Cham. https://doi.org/10.1007/978-3-031-24667-8_23


  • DOI: https://doi.org/10.1007/978-3-031-24667-8_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-24666-1

  • Online ISBN: 978-3-031-24667-8

  • eBook Packages: Computer Science (R0)
