EmoPercept: EEG-based emotion classification through perceiver

Soft Computing

Abstract

Emotions play an important role in human cognition and are commonly associated with perception, logical decision making, human interaction, and intelligence. Emotion and stress detection is an emerging topic of interest in the research community. With the availability of portable, cheap, and reliable sensor devices, researchers increasingly opt for physiological signals for emotion classification, as they are less susceptible to human deception than audiovisual signals. In recent years, deep neural networks have gained popularity and inspired new approaches to emotion recognition based on electroencephalogram (EEG) signals. Transformer-based architectures have since seen widespread use, providing state-of-the-art results in several domains, from natural language processing to computer vision and object detection. In this work, we investigate the effectiveness and accuracy of a novel transformer-based architecture, called the perceiver, which is designed to handle inputs from any modality, be it image, audio, or video. We apply the perceiver to raw EEG signals from one of the most widely used publicly available EEG-based emotion recognition datasets, DEAP, and compare its results with some of the best-performing models in the domain.
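The core perceiver idea referenced above is that a small, fixed set of learned latent vectors cross-attends to an arbitrarily large input array, so the same architecture can ingest raw signals from any modality, including multichannel EEG. The sketch below is a minimal, untrained NumPy illustration of that mechanism on an EEG-shaped input; all dimensions, the single cross-attention layer, and the linear classifier head are illustrative assumptions, not the actual EmoPercept configuration (which also uses a deeper latent self-attention stack and learned embeddings).

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(latents, inputs, wq, wk, wv):
    """Latents act as queries; the (large) input array supplies keys/values."""
    q = latents @ wq                                   # (n_latent, d)
    k = inputs @ wk                                    # (n_in, d)
    v = inputs @ wv                                    # (n_in, d)
    scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))   # (n_latent, n_in)
    return scores @ v                                  # (n_latent, d)

# Hypothetical raw-EEG window: 32 channels, 128 time samples
n_channels, n_time = 32, 128
eeg = rng.standard_normal((n_time, n_channels))

# Project each time step into the model dimension (illustrative embedding)
d_model, n_latent = 64, 8
proj = 0.1 * rng.standard_normal((n_channels, d_model))
inputs = eeg @ proj                                    # (n_time, d_model)

# Learned latent array (random here, since the model is untrained)
latents = 0.1 * rng.standard_normal((n_latent, d_model))
wq = 0.1 * rng.standard_normal((d_model, d_model))
wk = 0.1 * rng.standard_normal((d_model, d_model))
wv = 0.1 * rng.standard_normal((d_model, d_model))
latents = cross_attention(latents, inputs, wq, wk, wv)

# Pool latents and map to two classes (e.g., high/low valence)
w_out = 0.1 * rng.standard_normal((d_model, 2))
probs = softmax(latents.mean(axis=0) @ w_out)
print(probs.shape)  # (2,)
```

Note that the cost of the cross-attention scales linearly with the input length (here 128 time steps) rather than quadratically, which is what makes the perceiver practical on long raw signals.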


Fig. 1
Fig. 2


Abbreviations

EEG: Electroencephalogram
CNN: Convolutional neural network
NLP: Natural language processing
RNN: Recurrent neural network
DBN: Deep belief network
GCNN: Graph convolutional neural network
CapsNet: Capsule network
LSTM: Long short-term memory
DT: Decision trees


Acknowledgements

The authors would like to thank GIK Institute for providing research facilities. This work was sponsored by the Higher Education Commission (HEC) under the National Research Programme for Universities (NRPU) having project number 8898.

Author information


Contributions

The authors contributed to each part of this paper equally. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Zahid Halim.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Human and animal rights

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Communicated by Shah Nazir.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Aadam, Tubaishat, A., Al-Obeidat, F. et al. EmoPercept: EEG-based emotion classification through perceiver. Soft Comput 26, 10563–10570 (2022). https://doi.org/10.1007/s00500-021-06578-4
