
Subject-independent EEG emotion recognition with hybrid spatio-temporal GRU-Conv architecture

  • Original Article
  • Published in Medical & Biological Engineering & Computing

Abstract

Various deep learning frameworks have recently shown excellent performance in decoding electroencephalogram (EEG) signals, especially for human emotion recognition. However, most focus only on temporal features and ignore the spatial dimension. The gated recurrent unit (GRU) model handles time-series data well, while the convolutional neural network (CNN) extracts spatial characteristics from its input. This paper therefore introduces a hybrid deep learning framework, named GRU-Conv, that leverages the advantages of both. Unlike most previous GRU architectures, we retain the output of every GRU unit, so the GRU-Conv model can extract crucial spatio-temporal features from EEG data: after temporal processing, the GRU yields multi-dimensional features across all units, from which the CNN then extracts spatial information. In this way, EEG signals with different characteristics can be classified more accurately. Subject-independent experiments show that the model performs well on the SEED and DEAP databases, with an average accuracy of 87.04% on SEED and mean accuracies of 70.07% (arousal) and 67.36% (valence) on DEAP.
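The paper provides no code, but the core idea described above can be sketched in a few lines: run a GRU over the EEG sequence, keep the hidden state of every time step (rather than only the last one), stack them into a 2-D temporal feature map, and apply a convolution over that map to extract spatial structure. The sketch below is a minimal illustration with random placeholder weights and toy dimensions; it is not the authors' actual network configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_all_outputs(x, Wz, Uz, Wr, Ur, Wh, Uh):
    """Run a single-layer GRU over x (T, d_in) and keep EVERY
    timestep's hidden state, not just the final one."""
    T = x.shape[0]
    H = Uz.shape[0]
    h = np.zeros(H)
    outs = []
    for t in range(T):
        z = sigmoid(Wz @ x[t] + Uz @ h)          # update gate
        r = sigmoid(Wr @ x[t] + Ur @ h)          # reset gate
        h_tilde = np.tanh(Wh @ x[t] + Uh @ (r * h))
        h = (1 - z) * h + z * h_tilde
        outs.append(h)
    return np.stack(outs)                        # (T, H) temporal feature map

def conv2d_valid(fmap, kernel):
    """Plain 2-D 'valid' convolution over the stacked GRU outputs,
    standing in for the CNN stage that extracts spatial features."""
    kh, kw = kernel.shape
    H = fmap.shape[0] - kh + 1
    W = fmap.shape[1] - kw + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(fmap[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
T, d_in, d_h = 10, 32, 16      # toy sizes: 10 time steps, 32 EEG features, 16 hidden units
x = rng.standard_normal((T, d_in))
W = lambda *s: 0.1 * rng.standard_normal(s)   # random placeholder weights

fmap = gru_all_outputs(x, W(d_h, d_in), W(d_h, d_h), W(d_h, d_in),
                       W(d_h, d_h), W(d_h, d_in), W(d_h, d_h))
feat = conv2d_valid(fmap, W(3, 3))
print(fmap.shape, feat.shape)  # (10, 16) (8, 14)
```

The key contrast with a conventional GRU classifier is that `gru_all_outputs` returns the full `(T, H)` map instead of only `h` at the last step, which is what gives the convolutional stage a 2-D input to operate on.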



Funding

This study was supported by the National Natural Science Foundation of China under Grant No. 62072468.

Author information

Corresponding author

Correspondence to Yanjiang Wang.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


Cite this article

Xu, G., Guo, W. & Wang, Y. Subject-independent EEG emotion recognition with hybrid spatio-temporal GRU-Conv architecture. Med Biol Eng Comput 61, 61–73 (2023). https://doi.org/10.1007/s11517-022-02686-x
