
DDNet: a hybrid network based on deep adaptive multi-head attention and dynamic graph convolution for EEG emotion recognition

  • Original Paper
  • Published:
Signal, Image and Video Processing

Abstract

Emotion recognition plays a crucial role in cognitive science and human-computer interaction. Existing techniques tend to ignore the significant differences between subjects, resulting in limited accuracy and generalization ability, and they struggle to capture the complex relationships among the channels of electroencephalography (EEG) signals. A hybrid network is proposed to overcome these limitations. The proposed network comprises a deep adaptive multi-head attention (DAM) branch and a dynamic graph convolution (DGC) branch. The DAM branch uses residual convolution and an adaptive multi-head attention mechanism, allowing it to focus on multi-dimensional information from different representational subspaces at different locations. The DGC branch uses a dynamic graph convolutional neural network that learns topological features among the channels. The synergy of the two branches enhances the model's adaptability to subject differences and also optimizes the extraction of local features and the understanding of global patterns. Subject-independent experiments were conducted on the SEED and SEED-IV datasets. On SEED, the average accuracy was 92.63% and the average F1-score was 92.43%; on SEED-IV, the average accuracy was 85.03% and the average F1-score was 85.01%. The results show that the proposed network has significant advantages in cross-subject emotion recognition and can improve accuracy and generalization in emotion recognition tasks.
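To make the two-branch idea concrete, the following is a minimal NumPy sketch of the abstract's description, not the authors' implementation: a multi-head self-attention pass over per-channel features (the attention side of the DAM branch, with identity Q/K/V projections for brevity) and a dynamic graph convolution whose adjacency is computed from the current features rather than fixed. All shapes and names (`d_model = 8`, the similarity-based adjacency) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads):
    # X: (channels, d_model). Split features into heads, apply scaled
    # dot-product attention per head, and concatenate the results.
    C, d = X.shape
    assert d % num_heads == 0
    d_h = d // num_heads
    out = np.zeros_like(X)
    for h in range(num_heads):
        Q = K = V = X[:, h * d_h:(h + 1) * d_h]  # identity projections
        scores = softmax(Q @ K.T / np.sqrt(d_h))  # (C, C) attention map
        out[:, h * d_h:(h + 1) * d_h] = scores @ V
    return out

def dynamic_graph_conv(X, W):
    # "Dynamic": the adjacency among EEG channels is derived from the
    # data itself (feature similarity), then one graph-convolution
    # step A @ X @ W with a ReLU is applied.
    A = softmax(X @ X.T)              # data-dependent channel adjacency
    return np.maximum(A @ X @ W, 0)   # ReLU activation

rng = np.random.default_rng(0)
X = rng.standard_normal((62, 8))      # 62 EEG channels, 8-dim features
W = rng.standard_normal((8, 8))       # graph-conv weight matrix
fused = np.concatenate(
    [multi_head_attention(X, num_heads=2), dynamic_graph_conv(X, W)],
    axis=1,
)
print(fused.shape)  # (62, 16): concatenated two-branch representation
```

In the paper the two branch outputs are combined for classification; the concatenation above stands in for that fusion step.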



Author information

Authors and Affiliations

Authors

Contributions

Bingyue Xu conducted the data analysis and wrote the manuscript text. Xin Zhang reviewed and revised the initial draft. Xiu Zhang supervised and guided the research topics. Baiwei Sun and Yujie Wang contributed to supervision. All authors have reviewed the manuscript.

Corresponding author

Correspondence to Xin Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Xu, B., Zhang, X., Zhang, X. et al. DDNet: a hybrid network based on deep adaptive multi-head attention and dynamic graph convolution for EEG emotion recognition. SIViP 19, 293 (2025). https://doi.org/10.1007/s11760-025-03876-4
