ABSTRACT
Affective brain-computer interfaces have advanced to the point where researchers can reliably interpret labeled, artifact-free EEG data collected in laboratory settings. However, annotating EEG data is time-consuming and labor-intensive, which limits applications in practical scenarios. Furthermore, EEG data collected in daily life may be partially corrupted, since EEG signals are sensitive to noise. In this paper, we propose a Multi-view Spectral-Spatial-Temporal Masked Autoencoder (MV-SSTMA) with self-supervised learning to address these challenges for daily applications. MV-SSTMA is built on a multi-view CNN-Transformer hybrid architecture that interprets the emotion-related knowledge in EEG signals from spectral, spatial, and temporal perspectives. Our model operates in three stages: 1) in the generalized pre-training stage, channels of unlabeled EEG data from all subjects are randomly masked and then reconstructed, so that the model learns generic representations of EEG data; 2) in the personalized calibration stage, only a few labeled samples from a specific subject are used to calibrate the model; 3) in the personal test stage, the model decodes a subject's emotions from intact EEG data as well as from corrupted data with missing channels. Extensive experiments on two public emotional EEG datasets demonstrate that the proposed model achieves state-of-the-art performance on emotion recognition. Moreover, even under the abnormal circumstance of missing channels, the proposed model can still recognize emotions effectively.
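The channel-masking idea in the pre-training stage can be illustrated with a minimal sketch: hide a random subset of EEG channels, and score a reconstruction only on the hidden channels. This is not the authors' implementation; the function names, the zero-fill corruption, and the 50% mask ratio are assumptions made for illustration.

```python
import numpy as np

def mask_channels(eeg, mask_ratio=0.5, rng=None):
    """Randomly hide a subset of EEG channels.

    eeg: array of shape (channels, samples).
    Returns the corrupted signal (masked channels zeroed) and a
    boolean mask marking which channels were hidden.
    """
    rng = np.random.default_rng(rng)
    n_channels = eeg.shape[0]
    n_masked = int(round(n_channels * mask_ratio))
    masked_idx = rng.choice(n_channels, size=n_masked, replace=False)
    mask = np.zeros(n_channels, dtype=bool)
    mask[masked_idx] = True
    corrupted = eeg.copy()
    corrupted[mask] = 0.0
    return corrupted, mask

def reconstruction_loss(reconstructed, original, mask):
    """Mean squared error computed only on the masked channels,
    mirroring how a masked autoencoder is typically trained."""
    diff = reconstructed[mask] - original[mask]
    return float(np.mean(diff ** 2))
```

At test time, a recording with genuinely missing channels can be treated the same way as a training-time corruption: the missing channels play the role of the masked ones, and the pre-trained reconstruction pathway fills them in before emotion decoding.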
Index Terms
- A Multi-view Spectral-Spatial-Temporal Masked Autoencoder for Decoding Emotions with Self-supervised Learning