DOI: 10.1145/3369457.3369481

Viewer Arousal Display System Using Eye-Tracking and Skin Conductance Data

Published: 10 January 2020

Abstract

We developed a viewer arousal display system that represents viewers' emotional arousal at fixation areas on the frames of a viewed video by measuring and recording viewers' physiological signals and eye movements. We implemented two prototypes of the system. The first superimposes the information on the video as heat maps, encoding it in the position, hue, and saturation of the overlay. The second represents the information with the position, height, and color of bar graphs displayed around the video. The system addresses a limitation of previous systems, which could not indicate which area within a frame of a video was important, and can help video creators and analysts estimate and analyze viewers' responses to a video. We also conducted an experiment to verify the effectiveness of the system. The results demonstrate that the fixation areas the system calculated from the eye-tracking data generally overlapped with the subjectively reported attention points; however, the subjective arousal values differed slightly from the arousal values estimated from the skin conductance (SC) data.
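To make the first prototype's encoding concrete, the following is a minimal sketch (not the authors' implementation) of how an arousal value estimated from skin conductance could be rendered as a heat map at a gaze fixation point, using OpenCV and NumPy. The function name `overlay_arousal_heatmap`, the normalization of arousal to [0, 1], the Gaussian falloff radius, and the blue-to-red hue mapping are all illustrative assumptions rather than details taken from the paper.

```python
import numpy as np
import cv2

def overlay_arousal_heatmap(frame, fixation_xy, arousal, radius=60):
    """Blend an arousal heat-map blob onto one video frame.

    frame       -- HxWx3 uint8 BGR image (one video frame)
    fixation_xy -- (x, y) gaze fixation in pixel coordinates
    arousal     -- float in [0, 1], e.g. a normalized SC level (assumed)
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    fx, fy = fixation_xy
    # Position: Gaussian falloff centred on the fixation point.
    blob = np.exp(-((xs - fx) ** 2 + (ys - fy) ** 2) / (2.0 * radius ** 2))
    # Hue and saturation: blue/desaturated = calm, red/saturated = aroused.
    hsv = np.zeros_like(frame)
    hsv[..., 0] = int((1.0 - arousal) * 120)  # OpenCV hue: 120=blue, 0=red
    hsv[..., 1] = int(arousal * 255)
    hsv[..., 2] = 255
    color = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
    alpha = (0.6 * blob)[..., None]           # overlay opacity from falloff
    return (frame * (1 - alpha) + color * alpha).astype(np.uint8)

# Example: a gray frame with one fixation at (320, 180), arousal 0.8.
frame = np.full((360, 640, 3), 128, dtype=np.uint8)
cv2.imwrite("arousal_overlay.png",
            overlay_arousal_heatmap(frame, (320, 180), arousal=0.8))
```

Mapping low arousal to blue and high arousal to red, with saturation also tracking arousal, is one plausible reading of the paper's use of hue and saturation as the two color channels; the actual color scale the prototypes use is not specified in the abstract.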

Published In

OzCHI '19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction
December 2019, 631 pages
ISBN: 9781450376969
DOI: 10.1145/3369457

    In-Cooperation

    • HFESA: Human Factors and Ergonomics Society of Australia Inc.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Arousal
    2. Eye-tracking
    3. Skin Conductance
    4. Viewer Response

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

OZCHI'19: 31st Australian Conference on Human-Computer-Interaction
December 2-5, 2019
Fremantle, WA, Australia

    Acceptance Rates

    Overall Acceptance Rate 362 of 729 submissions, 50%
