DOI: 10.1145/3386392.3399289
Research article

Confusion Detection Dataset of Mouse and Eye Movements

Published: 13 July 2020

Abstract

Since emotion detection mostly employs supervised machine learning, large labeled datasets are needed to train accurate detectors. Currently, there is a lack of open datasets, especially in the domain of confusion detection on the web. In this paper, we introduce a confusion detection dataset comprising two modalities: the mouse movements and the eye movements of the users. The dataset was gathered during a quantitative controlled user study with 60 participants. We chose a travel agency web application for the study, for which we carefully designed six tasks reflecting the common behavior and problems of everyday users. We also discuss the issue of labeling emotional data during the study, provide an exploratory analysis of the dataset, and offer insights into the behavior of confused users.
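
The abstract does not specify the dataset's file layout, so the sketch below is purely illustrative: it assumes hypothetical per-task CSV logs of raw mouse events (t, x, y) and eye-tracking fixations (t, x, y, dur_ms), plus a labels.csv holding a binary confusion label per participant and task, extracts a few aggregate features commonly used in this line of work (mouse speed and direction changes, fixation durations, saccade lengths), and cross-validates a baseline classifier. All file and column names are assumptions, not the published format.

# Hypothetical sketch only: the dataset schema is not given in this abstract,
# so every file name and column name below is an illustrative assumption.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def mouse_features(df):
    # Aggregate features from a raw mouse log with columns t (s), x, y (px).
    dt = df["t"].diff().clip(lower=1e-3)            # seconds between events
    dx, dy = df["x"].diff(), df["y"].diff()
    speed = np.hypot(dx, dy) / dt                   # cursor speed in px/s
    turns = np.abs(np.diff(np.arctan2(dy, dx).dropna()))  # direction changes
    return {
        "mouse_speed_mean": speed.mean(),
        "mouse_speed_std": speed.std(),
        "mouse_turns_mean": turns.mean() if turns.size else 0.0,
        "mouse_pause_ratio": (speed < 5).mean(),    # fraction of near-still samples
    }

def gaze_features(df):
    # Aggregate features from a fixation log with columns t, x, y, dur_ms.
    return {
        "fix_dur_mean": df["dur_ms"].mean(),
        "fix_dur_std": df["dur_ms"].std(),
        "fix_count": float(len(df)),
        "saccade_len_mean": np.hypot(df["x"].diff(), df["y"].diff()).mean(),
    }

# labels.csv is assumed: one row per (participant, task) with a 0/1 label
# indicating whether the participant was confused on that task.
labels = pd.read_csv("labels.csv")
rows = []
for _, r in labels.iterrows():
    mouse = pd.read_csv(f"mouse/{r.participant}_{r.task}.csv")  # assumed path
    gaze = pd.read_csv(f"gaze/{r.participant}_{r.task}.csv")    # assumed path
    rows.append({**mouse_features(mouse), **gaze_features(gaze),
                 "confused": r.confused})

data = pd.DataFrame(rows).fillna(0.0)
X, y = data.drop(columns="confused"), data["confused"]
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

Note that with multiple tasks per participant, cross-validation in practice should be grouped by participant (e.g., sklearn's GroupKFold) so that a person's data never appears in both training and test folds.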

Supplementary Material

VTT File (3386392.3399289.vtt)
MP4 File (3386392.3399289.mp4)
Supplemental Video




Published In

UMAP '20 Adjunct: Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization
July 2020
395 pages
ISBN:9781450379502
DOI:10.1145/3386392

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. affective computing
  2. confusion detection
  3. emotions labeling
  4. eye movements data
  5. labeled dataset
  6. mouse movements data

Qualifiers

  • Research-article

Funding Sources

  • Vedecká Grantová Agentúra MŠVVaŠ SR a SAV
  • Agentúra na Podporu Výskumu a Vývoja

Conference

UMAP '20

Acceptance Rates

Overall Acceptance Rate: 162 of 633 submissions, 26%


Cited By

  • (2024) Gaze-based detection of mind wandering during audio-guided panorama viewing. Scientific Reports, 14:1. https://doi.org/10.1038/s41598-024-79172-x. Online publication date: 14-Nov-2024.
  • (2022) Towards Inclusive HRI: Using Sim2Real to Address Underrepresentation in Emotion Expression Recognition. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 9132-9139. https://doi.org/10.1109/IROS47612.2022.9982252. Online publication date: 23-Oct-2022.
  • (2022) Visual-based Confusion Detection using a Cooperative Spatio-Temporal Deep Neural Networks. 2022 International Conference on Digital Government Technology and Innovation (DGTi-CON), 80-85. https://doi.org/10.1109/DGTi-CON53875.2022.9849192. Online publication date: 24-Mar-2022.
  • (2021) A Review of an Invasive and Non-invasive Automatic Confusion Detection Techniques. IOP Conference Series: Materials Science and Engineering, 1105:1, 012026. https://doi.org/10.1088/1757-899X/1105/1/012026. Online publication date: 1-Jun-2021.
