DOI: 10.1145/3652988.3673963
ACM Conference Proceedings (IVA) · Extended abstract

A Virtual Agent as a Commensal Companion

Published: 26 December 2024

Abstract

Previous work introduced the concept of artificial commensal companions: embodied agents capable of interacting with humans during meals. They aim to bring the benefits of eating together to settings where a person would otherwise eat alone (e.g., older adults, hospitalized patients, people in self-isolation). This paper presents an experiment in which a virtual agent and a human eat together. We invited volunteers to bring a small meal and chat briefly with the agent, which simulated eating behaviors during the conversation. Afterwards, participants filled out a questionnaire providing quantitative and qualitative feedback. While the results are encouraging (participants showed interest in eating with an agent), further work is needed to obtain more conclusive results.



    Published In

    IVA '24: Proceedings of the 24th ACM International Conference on Intelligent Virtual Agents
    September 2024
    337 pages
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. activity recognition
    2. commensality
    3. companion
    4. interaction design
    5. virtual agent

    Qualifiers

    • Extended-abstract
    • Research
    • Refereed limited

    Conference

IVA '24: ACM International Conference on Intelligent Virtual Agents
September 16–19, 2024
Glasgow, United Kingdom

    Acceptance Rates

    Overall Acceptance Rate 53 of 196 submissions, 27%
