DOI: 10.1145/3372278.3390738
Short paper

Emotion Recognition from Galvanic Skin Response Signal Based on Deep Hybrid Neural Networks

Published: 08 June 2020

ABSTRACT

Emotion reflects human beings' physiological and psychological status. The Galvanic Skin Response (GSR) reveals the electrical characteristics of human skin and is widely used to detect the presence of emotion. In this work, we propose an emotion recognition framework based on deep hybrid neural networks, in which a 1D CNN and a Residual Bidirectional GRU are employed to analyze the time-series data. The experimental results show that the proposed method outperforms other state-of-the-art methods. In addition, we port the proposed emotion recognition model to a Raspberry Pi and design a real-time emotion interaction robot to verify the efficiency of this work.
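The hybrid architecture named in the abstract can be sketched as follows. This is a minimal illustrative PyTorch model, not the authors' implementation: the layer sizes, kernel width, pooling, and the placement of the residual connection around the bidirectional GRU are all assumptions made for the sake of the example.

```python
import torch
import torch.nn as nn

class GSREmotionNet(nn.Module):
    """Hypothetical sketch: 1D CNN front end + residual bidirectional GRU.
    All hyperparameters are illustrative, not taken from the paper."""
    def __init__(self, n_classes=2, hidden=64):
        super().__init__()
        # 1D CNN extracts local features from the raw GSR sequence
        self.cnn = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional GRU models temporal context; hidden//2 per
        # direction so the concatenated output matches the CNN width
        self.gru = nn.GRU(hidden, hidden // 2, batch_first=True,
                          bidirectional=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                # x: (batch, 1, time)
        h = self.cnn(x)                  # (batch, hidden, time/2)
        h = h.transpose(1, 2)            # (batch, time/2, hidden)
        out, _ = self.gru(h)             # (batch, time/2, hidden)
        h = h + out                      # residual skip around the GRU
        return self.fc(h.mean(dim=1))    # pool over time, then classify

model = GSREmotionNet()
logits = model(torch.randn(4, 1, 128))   # 4 windows of 128 GSR samples
print(tuple(logits.shape))               # (4, 2)
```

Matching the GRU's concatenated bidirectional output width to its input width is what makes the element-wise residual addition possible without a projection layer.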


Published in

ICMR '20: Proceedings of the 2020 International Conference on Multimedia Retrieval
June 2020, 605 pages
ISBN: 9781450370875
DOI: 10.1145/3372278

        Copyright © 2020 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Overall acceptance rate: 254 of 830 submissions, 31%
