DOI: 10.1145/3152832.3152836

DiVA: Exploring the Usage of Pupil Diameter to Elicit Valence and Arousal

Published: 26 November 2017

ABSTRACT

Most digital systems are not aware of their users' affective states. Adapting to the user's state has shown great potential for enhancing the user experience. However, most approaches for sensing affective states, specifically arousal and valence, involve expensive and obtrusive technologies, such as physiological sensors attached to the user's body. This paper presents an indicator of the user's affect based on eye tracking. We use a commercial eye tracker to monitor the user's pupil size and estimate their arousal and valence in response to videos of different content. To assess how different content (namely pleasant and unpleasant) influences arousal and valence, and thereby pupil diameter, we conducted a user study with 25 participants. The study showed that different video content affects pupil diameter, thereby giving an indicator of the user's state. We provide empirical evidence showing how to unobtrusively detect changes in the user's state. Our initial investigation paves the way for eye-based tracking of the user's state, introducing the potential for new applications in the field of affect-aware computing.
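The abstract does not describe the analysis pipeline, so the following is a hypothetical sketch of the kind of comparison it implies: baseline-correct each participant's pupil diameter trace, average the dilation per video condition, and test whether pleasant and unpleasant content differ. The baseline-correction step, the toy data, and the paired t-test are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch (assumed, not from the paper): compare
# baseline-corrected pupil dilation between pleasant and unpleasant videos.
import numpy as np
from scipy import stats

def mean_dilation(trace_mm: np.ndarray, baseline_mm: np.ndarray) -> float:
    """Mean pupil dilation over a video segment, relative to a resting baseline."""
    return float(np.nanmean(trace_mm) - np.nanmean(baseline_mm))

# Toy traces for one participant (diameter in mm, sampled by the eye tracker).
baseline = np.array([3.1, 3.0, 3.2, 3.1])
pleasant_trace = np.array([3.3, 3.4, 3.3, 3.5])
unpleasant_trace = np.array([3.6, 3.7, 3.5, 3.8])

print(mean_dilation(pleasant_trace, baseline))    # ≈ 0.28 mm
print(mean_dilation(unpleasant_trace, baseline))  # ≈ 0.55 mm

# Across participants, a paired t-test checks whether content type
# (pleasant vs. unpleasant) shifts pupil diameter systematically.
pleasant = np.array([0.28, 0.35, 0.18, 0.21])   # per-participant mean dilations
unpleasant = np.array([0.55, 0.52, 0.39, 0.47])
t_stat, p_value = stats.ttest_rel(pleasant, unpleasant)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

If each of the 25 participants watched both kinds of content (a within-subjects design, which the abstract suggests but does not state outright), a paired test of this sort would be the natural choice.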

  • Published in

    MUM '17: Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia
    November 2017
    567 pages
    ISBN: 9781450353786
    DOI: 10.1145/3152832

    Copyright © 2017 ACM

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    • Published: 26 November 2017

    Qualifiers

    • short-paper

    Acceptance Rates

    Overall Acceptance Rate: 190 of 465 submissions, 41%
