EEG-based Emotion Recognition with Feature Fusion Networks

  • Original Article
  • Published:
International Journal of Machine Learning and Cybernetics

Abstract

With the rapid development of human-computer interaction, automatic emotion recognition from multichannel electroencephalography (EEG) signals has attracted considerable attention in recent years. However, many existing studies on EEG-based emotion recognition ignore the correlation information between different EEG channels and cannot fully capture the contextual information of EEG signals. In this paper, a novel multi-feature fusion network is proposed that consists of spatial and temporal neural network structures and learns discriminative spatio-temporal emotional information for emotion recognition. Two common types of features are extracted: time-domain features (Hjorth parameters, differential entropy, and sample entropy) and a frequency-domain feature (power spectral density). Then, to learn spatial and contextual information, a convolutional neural network inspired by the inception structure of GoogLeNet is adopted to capture the intrinsic spatial relationships among EEG electrodes and the contextual information of the signals. Fully connected layers are used for feature fusion, and an SVM, rather than a softmax function, classifies the high-level emotion features. Finally, to evaluate the proposed method, we conduct leave-one-subject-out EEG emotion recognition experiments on the DEAP dataset. The experimental results show that the proposed method achieves excellent performance, with average recognition accuracies of 80.52% and 75.22% in the valence and arousal classification tasks of the DEAP database, respectively.
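For concreteness, the following is a minimal sketch of the hand-crafted feature stage and the SVM classifier with leave-one-subject-out evaluation described in the abstract, written with NumPy, SciPy, and scikit-learn. The sampling rate, frequency bands, filter order, SVM kernel, and the random stand-in data are assumptions; sample entropy and the inception-style CNN fusion stage are omitted for brevity; and the feature definitions (Hjorth parameters, Gaussian differential entropy, Welch band power) follow standard formulations rather than the authors' exact implementation.

```python
# Hedged sketch of the feature extraction + SVM pipeline; bands, window length,
# and hyperparameters are illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

FS = 128  # DEAP's preprocessed EEG is distributed downsampled to 128 Hz

def hjorth(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx, ddx = np.diff(x), np.diff(np.diff(x))
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

def differential_entropy(x, band, fs=FS):
    """DE of a band-passed signal under the usual Gaussian assumption:
    0.5 * log(2 * pi * e * sigma^2)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xf = filtfilt(b, a, x)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(xf))

def band_power(x, band, fs=FS):
    """Mean Welch PSD within a frequency band."""
    f, pxx = welch(x, fs=fs, nperseg=2 * fs)
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def trial_features(trial):
    """trial: (n_channels, n_samples) array -> flat per-channel feature vector."""
    feats = []
    for ch in trial:
        feats.extend(hjorth(ch))
        feats.extend(differential_entropy(ch, b) for b in BANDS.values())
        feats.extend(band_power(ch, b) for b in BANDS.values())
    return np.asarray(feats)

# Random data standing in for DEAP trials (32 channels, 60 s each).
rng = np.random.default_rng(0)
X = np.stack([trial_features(rng.standard_normal((32, 60 * FS))) for _ in range(40)])
y = rng.integers(0, 2, size=40)        # high/low valence labels (placeholder)
groups = rng.integers(0, 4, size=40)   # placeholder subject IDs per trial

# SVM replaces the softmax output, per the abstract; leave-one-subject-out CV.
clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("LOSO accuracies:", scores)
```

In the paper these hand-crafted features feed the inception-style CNN before fusion; here the SVM consumes them directly, which is only meant to illustrate the feature definitions and the evaluation protocol.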


Author information

Corresponding author

Correspondence to Yu Song.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Gao, Q., Yang, Y., Kang, Q. et al. EEG-based Emotion Recognition with Feature Fusion Networks. Int. J. Mach. Learn. & Cyber. 13, 421–429 (2022). https://doi.org/10.1007/s13042-021-01414-5

