Research article · DOI: 10.1145/3590003.3590041

An Emotion Recognition Method Based On Feature Fusion and Self-Supervised Learning

Published: 29 May 2023

Abstract

Because emotional disorders underlie many kinds of mental and cardiac conditions, accurate emotion recognition is in high demand. Deep learning methods using physiological signals have gained widespread application in emotion recognition. However, many existing methods rely solely on deep features, which can be difficult to interpret and may not capture physiological signals comprehensively. To address this issue, we propose a novel emotion recognition method based on feature fusion and self-supervised learning. The approach combines shallow features with deep learning features, yielding a more holistic and interpretable analysis of physiological signals. In addition, we transfer a self-supervised learning method from the image domain to signals, so that sophisticated and informative features are learned from unlabeled signal data. Experiments on WESAD, a publicly available dataset, show a significant improvement in performance, confirming the advantage of the proposed method over state-of-the-art approaches.
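The pipeline the abstract describes — a self-supervised pretext task on transformed signal windows, followed by fusion of shallow hand-crafted features with deep embeddings — can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: the transformation set follows the style of signal augmentations in [7, 8], the transformation parameters are arbitrary, and a fixed random projection stands in for the CNN encoder that would in practice be pretrained on the pretext task.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Pretext-task transformations (illustrative set and parameters) ---
def add_noise(x, sigma=0.05):
    return x + rng.normal(0.0, sigma, x.shape)

def scale(x, factor=1.5):
    return x * factor

def negate(x):
    return -x

def time_flip(x):
    return x[::-1]

TRANSFORMS = [lambda x: x, add_noise, scale, negate, time_flip]

def make_pretext_dataset(signals):
    """Apply each transformation to each window; the transformation
    index serves as the self-supervised (pseudo-)label, so no human
    annotation is needed to train the encoder."""
    X, y = [], []
    for s in signals:
        for label, transform in enumerate(TRANSFORMS):
            X.append(transform(s))
            y.append(label)
    return np.stack(X), np.array(y)

# --- Shallow (hand-crafted, interpretable) features ---
def shallow_features(x):
    return np.array([x.mean(), x.std(),
                     np.sqrt(np.mean(x ** 2)),   # RMS
                     x.max() - x.min()])          # peak-to-peak

# --- Fusion: concatenate shallow features with the encoder embedding ---
def fuse(x, encoder):
    return np.concatenate([shallow_features(x), encoder(x)])

# Stand-in encoder: a fixed random projection (in practice, a CNN
# pretrained on the pretext task above would produce this embedding).
W = rng.normal(size=(8, 256))
encoder = lambda x: np.tanh(W @ x)

signals = [np.sin(np.linspace(0, 8 * np.pi, 256)) for _ in range(3)]
X, y = make_pretext_dataset(signals)   # 3 windows x 5 transforms
fused = fuse(signals[0], encoder)      # 4 shallow + 8 deep features
print(X.shape, y.shape, fused.shape)   # (15, 256) (15,) (12,)
```

The fused vector would then feed a downstream emotion classifier; keeping the shallow block explicit is what preserves interpretability relative to a purely deep representation.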

References

[1] Maria Teresa La Rovere, Alessandra Gorini, and Peter J. Schwartz. 2022. Stress, the autonomic nervous system, and sudden death. Autonomic Neuroscience. Volume 237.
[2] Y. Hsu, J. Wang, W. Chiang, and C. Hung. 2020. Automatic ECG-based emotion recognition in music listening. IEEE Transactions on Affective Computing. Volume 11(1), pp. 85–99.
[3] Jennifer Healey and Rosalind W. Picard. 2005. Detecting stress during real-world driving tasks using physiological sensors. IEEE Transactions on Intelligent Transportation Systems. Volume 6(2), pp. 156–166.
[4] Philip Schmidt, Attila Reiss, Robert Duerichen, Claus Marberger, and Kristof Van Laerhoven. 2018. Introducing WESAD, a multimodal dataset for wearable stress and affect detection. In Proceedings of the 20th International Conference on Multimodal Interaction. pp. 400–408.
[5] H. Cui, A. Liu, X. Zhang, X. Chen, K. Wang, and X. Chen. 2020. EEG-based emotion recognition using an end-to-end regional-asymmetric convolutional neural network. Knowledge-Based Systems. Volume 205, p. 106243.
[6] Jionghao Lin, Shirui Pan, Cheng Siong Lee, and Sharon Oviatt. 2019. An explainable deep fusion network for affect recognition using physiological signals. In Proceedings of the 28th International Conference on Information and Knowledge Management. pp. 2069–2072.
[7] Pritam Sarkar and Ali Etemad. 2021. Self-supervised ECG representation learning for emotion recognition. IEEE Transactions on Affective Computing. Volume 13(3), pp. 1541–1554.
[8] Terry Taewoong Um, Franz Michael Josef Pfister, Daniel Pichler, Satoshi Endo, Muriel Lang, Sandra Hirche, et al. 2017. Data augmentation of wearable sensor data for Parkinson's disease monitoring using convolutional neural networks. arXiv preprint arXiv:1706.00527.
[9] J. Hu, L. Shen, S. Albanie, G. Sun, and E. Wu. 2020. Squeeze-and-excitation networks. IEEE Transactions on Pattern Analysis and Machine Intelligence. Volume 42(8), pp. 2011–2023.
[10] A. Alshehri, Y. Bazi, N. Ammour, H. Almubarak, and N. Alajlan. 2019. Deep attention neural network for multi-label classification in unmanned aerial vehicle imagery. IEEE Access. Volume 7, pp. 119873–119880.
[11] Pritam Sarkar, Kyle Ross, Aaron J. Ruberto, Dirk Rodenbura, Paul Hungler, and Ali Etemad. 2019. Classification of cognitive load and expertise for adaptive simulation using deep multitask learning. In 8th IEEE International Conference on Affective Computing and Intelligent Interaction. pp. 1–7.
[12] Henry Friday Nweke, Teh Ying Wah, and Ghulam Mujtaba. 2019. Data fusion and multiple classifier systems for human activity detection and health monitoring: Review and open research directions. Information Fusion. Volume 46, pp. 147–170.


    Published In

    CACML '23: Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning
    March 2023
    598 pages
    ISBN:9781450399449
    DOI:10.1145/3590003

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. emotion recognition
    2. feature fusion
    3. physiological signals
    4. self-supervised learning

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Science and Technology Department of Sichuan Province of China

    Conference

    CACML 2023

    Acceptance Rates

CACML '23 paper acceptance rate: 93 of 241 submissions (39%).
Overall acceptance rate: 93 of 241 submissions (39%).

