ABSTRACT
Most digital systems are not aware of their users' affective states. Adapting to users' states has shown great potential for enhancing user experiences. However, most approaches for sensing affective states, specifically arousal and valence, rely on expensive and obtrusive technologies, such as physiological sensors attached to users' bodies. This paper presents an indicator of users' affect based on eye tracking. We use a commercial eye tracker to monitor users' pupil size and estimate their arousal and valence in response to videos of different content. To assess how content-induced changes in arousal and valence (elicited by pleasant and unpleasant videos) influence pupil diameter, we conducted a user study with 25 participants. The study showed that video content affects pupil diameter, thereby providing an indicator of the user's state. We provide empirical evidence showing how to unobtrusively detect changes in users' states. Our initial investigation paves the way for eye-based affect tracking, which opens up the potential for new applications in the field of affect-aware computing.
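The abstract does not detail how raw pupil measurements are turned into an affect indicator. As a rough, hypothetical illustration of the kind of pupil-based arousal indicator described, and not the authors' actual method, the following Python sketch computes the baseline-corrected change in pupil diameter during a video stimulus; the data format, function name, and one-second baseline window are all assumptions.

```python
from statistics import mean

def pupil_arousal_indicator(samples, baseline_window=1.0):
    """Compute a simple pupil-based arousal indicator (hypothetical sketch).

    samples: list of (timestamp_seconds, diameter_mm) tuples recorded by an
             eye tracker while a video stimulus is shown; timestamps start at 0.
    baseline_window: seconds at the start of the recording treated as a
                     per-stimulus baseline (an assumed design choice).

    Returns the mean baseline-corrected diameter change in mm; larger positive
    values indicate pupil dilation, a common correlate of increased arousal
    in the psychophysiology literature.
    """
    baseline = [d for t, d in samples if t < baseline_window]
    stimulus = [d for t, d in samples if t >= baseline_window]
    if not baseline or not stimulus:
        raise ValueError("not enough samples for baseline or stimulus window")
    return mean(stimulus) - mean(baseline)

# Hypothetical usage with synthetic 60 Hz samples: diameter 3.0 mm during the
# baseline second, dilating to 3.2 mm once the stimulus takes effect.
samples = [(i / 60.0, 3.2 if i / 60.0 >= 1.0 else 3.0) for i in range(600)]
print(pupil_arousal_indicator(samples))  # ~0.2 mm dilation
```

In practice, pupil diameter also responds strongly to luminance, so any real indicator of this kind would need to control for the brightness of the video stimuli.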