
Time series-dependent feature of EEG signals for improved visually evoked emotion classification using EmotionCapsNet

  • S.I.: Deep Learning for Time Series Data
  • Published in: Neural Computing and Applications

Abstract

In recent studies, machine learning and deep learning strategies have been explored in many EEG-based applications to achieve the best performance. More specifically, convolutional neural networks (CNNs) have demonstrated remarkable capability in electroencephalogram (EEG)-evoked emotion classification tasks. Existing CNN-based emotion classification techniques using EEG signals mostly involve a moderately intricate feature extraction phase before any network model is applied. Moreover, CNNs cannot adequately describe the natural interrelations among the various EEG channels, which provide essential information for classifying different emotional states. In this paper, an efficient and advanced variant of the CNN, called the Emotion-based Capsule Network (EmotionCapsNet), is presented for multi-channel EEG-based emotion classification with improved accuracy. EmotionCapsNet is applied both to raw EEG signals and to 2D image representations generated from EEG signals, allowing it to extract descriptive and complex features from the EEG signals and discriminate between different emotional states. The proposed system is then compared with conventional machine learning models and deep learning-based CNN models. Our strategy achieves average accuracies of 77.50%, 78.44% and 79.38% for valence, arousal and dominance on DEAP, 79.06%, 78.90% and 79.69% on AMIGOS, and 80.34%, 83.04% and 82.50% for valence, arousal and dominance on DREAMER, respectively. These results demonstrate that the adapted strategy yields comparable precision on raw EEG signals and provides better classification results on spatiotemporal features of EEG signals for the emotion classification task.
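Capsule networks such as EmotionCapsNet replace scalar CNN activations with vector-valued capsules whose outputs are combined by dynamic routing-by-agreement. The sketch below is an illustrative NumPy implementation of that generic routing step (following Sabour et al.'s formulation), not the authors' actual model: the capsule counts, vector dimensions, and random prediction vectors are hypothetical stand-ins for features that a real network would compute from EEG channels.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity: scales a vector's length into [0, 1)
    # while preserving its direction, so length can act as a probability.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iters=3):
    # u_hat: prediction vectors, shape (n_primary, n_classes, dim).
    # Returns output capsules, shape (n_classes, dim); their lengths
    # serve as class scores (e.g. low vs. high valence).
    n_primary, n_classes, _ = u_hat.shape
    b = np.zeros((n_primary, n_classes))              # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coeffs
        s = (c[..., None] * u_hat).sum(axis=0)        # weighted sum per class
        v = squash(s)                                 # candidate output capsules
        b = b + np.einsum('ijk,jk->ij', u_hat, v)     # agreement update
    return v

# Toy forward pass: 32 hypothetical primary capsules routed to
# 2 emotion-class capsules of dimension 8.
rng = np.random.default_rng(0)
u_hat = rng.normal(size=(32, 2, 8))
v = dynamic_routing(u_hat)
scores = np.linalg.norm(v, axis=-1)  # capsule lengths in [0, 1)
print(v.shape, scores)
```

In a full model, `u_hat` would come from learned transformation matrices applied to primary capsules extracted by convolutional layers over the multi-channel EEG input; routing then lets capsules that agree on a class reinforce one another, which is the mechanism the paper relies on to capture inter-channel relationships.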



Author information

Corresponding author

Correspondence to Nandini Kumari.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Kumari, N., Anwar, S. & Bhattacharjee, V. Time series-dependent feature of EEG signals for improved visually evoked emotion classification using EmotionCapsNet. Neural Comput & Applic 34, 13291–13303 (2022). https://doi.org/10.1007/s00521-022-06942-x

