Abstract:
In contemporary applications of emotional brain-computer interfaces, experimental data frequently originate from distinct individuals, leading to considerable inter-individual variance. This variance makes it difficult to generalize models to novel, unseen individuals. This study introduces a novel approach to cross-subject EEG emotion recognition based on a spatial-temporal neural network integrated with an attention mechanism. The approach sequentially extracts spatial- and temporal-domain features, mitigating inter-subject variability by identifying emotionally salient features through the attention module, which significantly improves classification performance. The network operates end to end: it first removes the baseline signal from the raw EEG input and then segments the data into windows for preprocessing. Specifically, the spatial attributes of different subjects' EEG data are extracted by a spatial-attention-based CNN, while temporal dynamics are captured by a self-attention-based LSTM. Finally, the two sets of features are fused for cross-subject EEG emotion recognition. Experimental results show that this method derives more discriminative features from the original EEG signals, achieving an average classification accuracy of 89.29% on the DEAP dataset, a notable improvement over alternative methods. These results offer methodological insights for emotional brain-computer interface systems in real-world scenarios.
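The preprocessing the abstract describes, subtracting a baseline segment from the raw EEG and segmenting the remainder into windows, can be sketched as below. This is a minimal illustration, not the paper's implementation: the 3 s baseline length, 1 s window size, and the DEAP-like shapes (32 channels at 128 Hz) are assumptions for the example.

```python
import numpy as np

def preprocess_eeg(raw, baseline_len, win_len):
    """Remove the baseline signal from one trial and segment it into windows.

    raw          : (channels, samples) raw EEG for one trial
    baseline_len : number of leading samples treated as the baseline segment
    win_len      : samples per window
    Returns an array of shape (n_windows, channels, win_len).
    """
    # Per-channel mean of the pre-stimulus baseline segment
    baseline = raw[:, :baseline_len].mean(axis=1, keepdims=True)
    # Subtract the baseline from the task portion of the trial
    signal = raw[:, baseline_len:] - baseline
    # Segment into non-overlapping windows
    n_win = signal.shape[1] // win_len
    windows = signal[:, :n_win * win_len].reshape(
        signal.shape[0], n_win, win_len).transpose(1, 0, 2)
    return windows

# Illustrative shapes loosely following DEAP: 32 channels, 128 Hz,
# 3 s baseline followed by 60 s of stimulus signal, 1 s windows
rng = np.random.default_rng(0)
trial = rng.standard_normal((32, 128 * 63))
wins = preprocess_eeg(trial, baseline_len=128 * 3, win_len=128)
print(wins.shape)  # (60, 32, 128)
```

Each resulting window would then be fed to the spatial-attention CNN, with the window sequence consumed by the self-attention LSTM.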
Published in: 2023 16th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)
Date of Conference: 28-30 October 2023
Date Added to IEEE Xplore: 02 January 2024