
Generator-based Domain Adaptation Method with Knowledge Free for Cross-subject EEG Emotion Recognition

Published in: Cognitive Computation

Abstract

Most existing approaches to cross-subject electroencephalogram (EEG) emotion recognition learn features that are universal across subjects by drawing on neurological findings. The performance of these methods may be sub-optimal because the relationships between the brain and emotion are not yet fully understood. Hence, when neurological findings are insufficient, it is essential to develop a domain adaptation method driven by the EEG data themselves. In this paper, we propose a generator-based domain adaptation method with a knowledge-free (GDAKF) mechanism for cross-subject EEG emotion recognition. Specifically, the feature distribution of the source domain is transformed into that of the target domain via adversarial learning between a generator and a discriminator. Additionally, the transformation is constrained by an EEG content regression loss and an emotion information loss, so that emotional information is preserved during feature alignment. To evaluate the effectiveness of GDAKF, extensive experiments are carried out on the benchmark DEAP dataset. GDAKF achieves a mean accuracy of 63.85% on low/high valence classification, which is comparable to existing cross-subject EEG emotion recognition methods in the literature. This paper provides a novel approach to cross-subject EEG emotion recognition, and the method can also be applied to cross-session and cross-device emotion recognition tasks.
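The abstract describes a generator objective with three terms: an adversarial term against the discriminator, an EEG content regression term, and an emotion information term. The following is a minimal, illustrative sketch of how such a composite loss might be composed; the function names, the specific loss forms (non-saturating log loss, MSE, cross-entropy), and the weights `lam_content` and `lam_emotion` are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two feature arrays."""
    return float(np.mean((a - b) ** 2))

def generator_loss(d_fake, gen_feats, src_feats, emo_logits, emo_labels,
                   lam_content=1.0, lam_emotion=1.0):
    """Composite generator loss (illustrative sketch, not the paper's code).

    d_fake     : discriminator outputs in (0, 1) for generated features
    gen_feats  : features produced by the generator
    src_feats  : source-domain features (content to be preserved)
    emo_logits : emotion classifier logits on the generated features
    emo_labels : integer emotion labels (e.g. 0 = low valence, 1 = high)
    """
    # Adversarial term: generator wants the discriminator to output 1
    # on generated features (non-saturating log loss).
    adv = float(-np.mean(np.log(d_fake + 1e-8)))

    # EEG content regression loss: keep generated features close to the
    # source content so the transformation does not destroy the signal.
    content = mse(gen_feats, src_feats)

    # Emotion information loss: cross-entropy on the emotion labels,
    # so emotional information survives the feature alignment.
    shifted = emo_logits - emo_logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted)
    probs /= probs.sum(axis=1, keepdims=True)
    picked = probs[np.arange(len(emo_labels)), emo_labels]
    emotion = float(-np.mean(np.log(picked + 1e-8)))

    return adv + lam_content * content + lam_emotion * emotion
```

In an adversarial training loop, this loss would be minimized for the generator while the discriminator is trained with the opposite objective; the two constraint terms act as regularizers during the distribution alignment.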



Acknowledgements

The authors would like to thank all reviewers for their constructive and helpful reviews.

Funding

This research is funded by the National Natural Science Foundation of China (62106136, 61902231), the Natural Science Foundation of Guangdong Province (2019A1515010943), the Basic and Applied Basic Research of Colleges and Universities in Guangdong Province (Special Projects in Artificial Intelligence) (2019KZDZX1030), the 2020 Li Ka Shing Foundation Cross-Disciplinary Research Grant (2020LKSFG04D), and the Science and Technology Major Project of Guangdong Province (STKJ2021005).

Author information

Corresponding author: Dazhi Jiang.

Ethics declarations

Ethical Approval

This article does not contain any studies with human participants performed by any of the authors.

Informed Consent

Informed consent was not required as no humans or animals were involved.

Conflicts of Interest

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Huang, D., Zhou, S. & Jiang, D. Generator-based Domain Adaptation Method with Knowledge Free for Cross-subject EEG Emotion Recognition. Cogn Comput 14, 1316–1327 (2022). https://doi.org/10.1007/s12559-022-10016-4
