
BiSMSM: A Hybrid MLP-Based Model of Global Self-Attention Processes for EEG-Based Emotion Recognition

  • Conference paper
In: Artificial Neural Networks and Machine Learning – ICANN 2022 (ICANN 2022)

Abstract

Due to the instability and complex distribution of electroencephalography (EEG) signals and the large cross-subject variations, extracting valuable and discriminative emotional information from EEG remains a significant challenge in EEG-based emotion recognition. In this paper, we propose the Bi-Stream MLP-SA Mixer (BiSMSM), a novel model for emotion recognition that consists of two streams: a Spatial stream and a Temporal stream. The model captures signal information from four angles, from space to time and from local to global, aiming to encode more discriminative features describing emotions. The Spatial stream focuses on spatial information, while the Temporal stream concentrates on correlations in the time domain. The two streams share a similar structure: each consists of an MLP-based module that extracts regional in-channel and cross-channel information, followed by a global self-attention mechanism that attends to global signal correlations. We conduct subject-independent experiments on the DEAP and DREAMER datasets to verify the performance of our model, which outperforms related methods. We obtained average accuracies of 62.97% for valence and 61.87% for arousal classification on DEAP, and 60.87% for valence and 63.28% for arousal on DREAMER.
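The abstract's description of each stream (an MLP-based mixing module for regional in-channel and cross-channel information, followed by global self-attention) can be sketched roughly as below. This is a minimal NumPy illustration, not the authors' implementation: all shapes, hidden sizes, the random weights, and the `mixer_block`/`global_self_attention`/`stream` helpers are assumptions for exposition only.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, d_hidden):
    """Two-layer MLP with random weights (illustrative only)."""
    d_in = x.shape[-1]
    w1 = rng.standard_normal((d_in, d_hidden)) * 0.02
    w2 = rng.standard_normal((d_hidden, d_in)) * 0.02
    return np.maximum(x @ w1, 0.0) @ w2

def mixer_block(x):
    """MLP-Mixer-style block on a (channels, features) array:
    cross-channel mixing on the transposed view, then in-channel mixing."""
    x = x + mlp(x.T, 64).T   # cross-channel (token) mixing
    x = x + mlp(x, 64)       # in-channel (feature) mixing
    return x

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def global_self_attention(x):
    """Plain scaled dot-product self-attention over all rows of x
    (projection matrices omitted for brevity)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ x

def stream(x):
    """One stream: local MLP mixing followed by global self-attention."""
    return global_self_attention(mixer_block(x))

# Toy input: 32 EEG channels x 128 samples per channel (shapes are assumptions).
x = rng.standard_normal((32, 128))
spatial = stream(x)          # spatial stream: attends across channels
temporal = stream(x.T).T     # temporal stream: attends across time steps
fused = np.concatenate([spatial, temporal], axis=-1)
print(fused.shape)  # (32, 256)
```

The two streams differ only in which axis they treat as tokens, which is why their structures can be near-identical; a classification head over `fused` would produce the valence/arousal predictions.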



Acknowledgements

This work was supported in part by the Aeronautical Science Foundation of China under Grant 20200058069001, in part by the Basic Research Project of Leading Technology of Jiangsu Province under Grant BK20192004, and in part by the Fundamental Research Funds for the Central Universities under Grant 2242021R41094.

Author information

Correspondence to Wei Li.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Li, W., Tian, Y., Hou, B., Dong, J., Shao, S. (2022). BiSMSM: A Hybrid MLP-Based Model of Global Self-Attention Processes for EEG-Based Emotion Recognition. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds) Artificial Neural Networks and Machine Learning – ICANN 2022. ICANN 2022. Lecture Notes in Computer Science, vol 13529. Springer, Cham. https://doi.org/10.1007/978-3-031-15919-0_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-15919-0_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-15918-3

  • Online ISBN: 978-3-031-15919-0

  • eBook Packages: Computer Science, Computer Science (R0)
