Research article
DOI: 10.1145/3136755.3136814

UE-HRI: a new dataset for the study of user engagement in spontaneous human-robot interactions

Published: 03 November 2017

Abstract

In this paper, we present a new dataset of spontaneous interactions between a robot and humans, of which 54 interactions (each lasting between 4 and 15 minutes) are freely available for download and use. Participants were recorded while holding spontaneous conversations with the robot Pepper. A conversation started automatically when the robot detected the presence of a participant, and the recording was kept only if the participant consented to being recorded. Pepper was placed in a public space, so participants were free to start and end the interaction whenever they wished. The dataset provides rich streams of data that can be used by research and development groups in a variety of areas.



    Published In

    ICMI '17: Proceedings of the 19th ACM International Conference on Multimodal Interaction
    November 2017
    676 pages
    ISBN:9781450355438
    DOI:10.1145/3136755
Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. HRI dataset
    2. in-the-wild
    3. user engagement


Conference

    ICMI '17, 13–17 November 2017, Glasgow, UK

    Acceptance Rates

ICMI '17 paper acceptance rate: 65 of 149 submissions (44%)
    Overall acceptance rate: 453 of 1,080 submissions (42%)


    Cited By

    • (2025) Multimodal Engagement Prediction in Human-Robot Interaction Using Transformer Neural Networks. MultiMedia Modeling, pp. 3–17. DOI: 10.1007/978-981-96-2074-6_1. Online publication date: 1 Jan 2025.
    • (2024) Sensors, Techniques, and Future Trends of Human-Engagement-Enabled Applications: A Review. Algorithms 17(12), 560. DOI: 10.3390/a17120560. Online publication date: 6 Dec 2024.
    • (2024) SEMPI: A Database for Understanding Social Engagement in Video-Mediated Multiparty Interaction. Proceedings of the 26th International Conference on Multimodal Interaction, pp. 546–555. DOI: 10.1145/3678957.3685752. Online publication date: 4 Nov 2024.
    • (2024) Predicting Human Intent to Interact with a Public Robot: The People Approaching Robots Database (PAR-D). Proceedings of the 26th International Conference on Multimodal Interaction, pp. 536–545. DOI: 10.1145/3678957.3685706. Online publication date: 4 Nov 2024.
    • (2024) Goes to the Heart: Speaking the User's Native Language. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 214–218. DOI: 10.1145/3610978.3640633. Online publication date: 11 Mar 2024.
    • (2024) A Rosbag Tool to Improve Dataset Reliability. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 1085–1089. DOI: 10.1145/3610978.3640556. Online publication date: 11 Mar 2024.
    • (2024) RW4T Dataset: Data of Human-Robot Behavior and Cognitive States in Simulated Disaster Response Tasks. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 924–928. DOI: 10.1145/3610977.3637481. Online publication date: 11 Mar 2024.
    • (2024) REACT: Two Datasets for Analyzing Both Human Reactions and Evaluative Feedback to Robots Over Time. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 885–889. DOI: 10.1145/3610977.3637480. Online publication date: 11 Mar 2024.
    • (2024) PoseTron: Enabling Close-Proximity Human-Robot Collaboration Through Multi-human Motion Prediction. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 830–839. DOI: 10.1145/3610977.3635006. Online publication date: 11 Mar 2024.
    • (2024) Automatic Context-Aware Inference of Engagement in HMI: A Survey. IEEE Transactions on Affective Computing 15(2), pp. 445–464. DOI: 10.1109/TAFFC.2023.3278707. Online publication date: Apr 2024.
