DOI: 10.1145/3369457.3369516
OzCHI Conference Proceedings · Short paper

Measuring Observers' EDA Responses to Emotional Videos

Published: 10 January 2020

Abstract

Future human-computer interaction research could be enriched by enabling computers to recognize emotional states from observers' physiological activity. In this paper, observers' electrodermal activity (EDA) is analyzed to recognize seven emotional categories while the observers watch a total of 80 emotional videos. Twenty participants took part as observers, and 16 features were extracted from each video's EDA signal after several processing steps. Mean analysis shows that some emotions differ significantly from each other, but not all of them. The arousal model generated from this dataset, with these participants and their EDA responses, also differs slightly from the abstract models proposed in the literature. Finally, a leave-one-observer-out approach with a neural network classifier was employed to measure performance, and the classifier reaches up to 94.8% accuracy on the seven-class problem. This high accuracy suggests the system's potential for recognizing emotions from observers' physiology in future human-computer interaction settings. Generating an arousal model for a specific setting also offers a way to investigate bias in dataset selection by measuring participant responses to that dataset.
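The leave-one-observer-out protocol the abstract describes can be sketched as follows: all trials from one observer form the test set, and the remaining observers' trials form the training set, repeated once per observer. This is a minimal illustration on synthetic data; the feature values, the weak class separation, and the nearest-centroid stand-in classifier are all assumptions for demonstration (the paper itself uses a neural network on real EDA features).

```python
import numpy as np

rng = np.random.default_rng(0)
n_observers, videos_per_observer = 20, 80   # as described in the abstract
n_features, n_classes = 16, 7               # 16 EDA features, 7 emotion classes

# Synthetic "EDA feature" matrix: one row per (observer, video) trial.
observer_id = np.repeat(np.arange(n_observers), videos_per_observer)
labels = rng.integers(0, n_classes, size=n_observers * videos_per_observer)
features = rng.normal(size=(n_observers * videos_per_observer, n_features))
features += labels[:, None] * 0.5  # make the classes weakly separable

def nearest_centroid_fit_predict(X_tr, y_tr, X_te):
    """Toy stand-in classifier: assign each test trial to the class
    whose training-set centroid is nearest in feature space."""
    centroids = np.stack([X_tr[y_tr == c].mean(axis=0) for c in range(n_classes)])
    dists = ((X_te[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

# Leave-one-observer-out: hold out every trial from one observer at a time.
accuracies = []
for held_out in range(n_observers):
    test = observer_id == held_out
    preds = nearest_centroid_fit_predict(features[~test], labels[~test], features[test])
    accuracies.append((preds == labels[test]).mean())

print(f"mean leave-one-observer-out accuracy: {np.mean(accuracies):.3f}")
```

Splitting by observer rather than by random trial is the key design choice here: it prevents trials from the same person appearing in both training and test sets, so the reported accuracy reflects generalization to an unseen observer.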


Cited By

  • (2024) Evaluating Real-Time Emotional Responses Using Bullet Screen Sentiment Analysis: Evidence from Electrodermal Activity. HCI International 2024 – Late Breaking Papers, pp. 240-253. https://doi.org/10.1007/978-3-031-76806-4_18. Online publication date: 17 December 2024.


Published In

OzCHI '19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction
December 2019
631 pages
ISBN: 9781450376969
DOI: 10.1145/3369457
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

In-Cooperation

  • HFESA: Human Factors and Ergonomics Society of Australia Inc.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Arousal Model
  2. Electrodermal Activity
  3. Emotion Recognition
  4. Neural Network

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

OzCHI '19: 31st Australian Conference on Human-Computer-Interaction
December 2-5, 2019
Fremantle, WA, Australia

Acceptance Rates

Overall acceptance rate: 362 of 729 submissions (50%)


Article Metrics

  • Downloads (last 12 months): 40
  • Downloads (last 6 weeks): 1
Reflects downloads up to 20 Jan 2025

