DOI: 10.1145/3379157.3391655

Analyzing Transferability of Happiness Detection via Gaze Tracking in Multimedia Applications

Published: 02 June 2020

Abstract

How are strong positive affective states related to eye-tracking features, and how can such features be used to enhance well-being during multimedia consumption? In this paper, we propose a robust classification algorithm for predicting strongly happy emotional states from a large set of features acquired with wearable eye-tracking glasses. We evaluate the potential transferability of the model across subjects and provide a model-agnostic, interpretable feature-importance metric. The proposed algorithm achieves a true-positive rate of 70% while keeping the false-positive rate at a low 10%, with features extracted from the pupil diameter ranked as most important.
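The abstract names two technical ingredients: evaluation via true-positive and false-positive rates, and a model-agnostic, interpretable feature-importance metric. The sketch below illustrates both under stated assumptions; the paper's actual classifier, features, and data are not reproduced here. Permutation importance is used as a representative model-agnostic technique, and the threshold classifier and toy "pupil diameter" feature are hypothetical.

```python
import random

def tpr_fpr(y_true, y_pred):
    """True-positive and false-positive rates from binary labels and predictions."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), fp / (fp + tn)

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def permutation_importance(predict, X, y, feature_idx, n_repeats=10, seed=0):
    """Model-agnostic importance: mean drop in accuracy when one feature
    column is randomly shuffled, breaking its link to the label."""
    rng = random.Random(seed)
    baseline = accuracy(y, [predict(row) for row in X])
    drops = []
    for _ in range(n_repeats):
        col = [row[feature_idx] for row in X]
        rng.shuffle(col)
        X_perm = [row[:feature_idx] + [v] + row[feature_idx + 1:]
                  for row, v in zip(X, col)]
        drops.append(baseline - accuracy(y, [predict(row) for row in X_perm]))
    return sum(drops) / n_repeats

# Toy data: feature 0 stands in for a pupil-diameter statistic that drives
# the label; feature 1 is uninformative noise.
rng = random.Random(1)
X = ([[rng.uniform(0.6, 1.0), rng.random()] for _ in range(10)]
     + [[rng.uniform(0.0, 0.4), rng.random()] for _ in range(10)])
y = [1] * 10 + [0] * 10
predict = lambda row: 1 if row[0] > 0.5 else 0

print(tpr_fpr(y, [predict(row) for row in X]))   # (1.0, 0.0): separable toy data
print(permutation_importance(predict, X, y, 0))  # large drop: informative feature
print(permutation_importance(predict, X, y, 1))  # 0.0: feature unused by the model
```

Because the importance score only requires a `predict` function and a metric, it applies equally to a random forest or any other classifier, which is what makes the approach model-agnostic.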


Cited By

  • (2024) Cognitive state detection with eye tracking in the field: an experience sampling study and its lessons learned. i-com 23, 1 (2024), 109–129. DOI: 10.1515/icom-2023-0035. Online publication date: 15-Apr-2024.
  • (2024) Emotion recognition and artificial intelligence. Information Fusion 102, C (2024). DOI: 10.1016/j.inffus.2023.102019. Online publication date: 1-Feb-2024.
  • (2023) Technical Design Space Analysis for Unobtrusive Driver Emotion Assessment Using Multi-Domain Context. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, 4 (2023), 1–30. DOI: 10.1145/3569466. Online publication date: 11-Jan-2023.


Published In

ETRA '20 Adjunct: ACM Symposium on Eye Tracking Research and Applications
June 2020
200 pages
ISBN:9781450371353
DOI:10.1145/3379157


Publisher

Association for Computing Machinery, New York, NY, United States



Author Tags

  1. affect detection
  2. eye-tracking
  3. multimedia data analysis

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • German Research Foundation (DFG)

Conference

ETRA '20

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


