
Multimodal Fusion of EEG and EMG Signals Using Self-Attention Multi-Temporal Convolutional Neural Networks for Enhanced Hand Gesture Recognition in Rehabilitation


Abstract:

In this work, we introduce an innovative approach to hand gesture recognition aimed at rehabilitation applications, utilising the synergistic potential of multimodal data fusion from electroencephalogram (EEG) and electromyogram (EMG) sensors. Our approach exploits the strength of Self-Attention Multi-Temporal Convolutional Networks (SAMTCN), which adeptly combine the distinct and complementary insights provided by EEG and EMG signals. The core of our methodology is the strategic application of self-attention mechanisms with multi-temporal convolutional architectures. This design choice allows our model to capture and analyse temporal patterns in multimodal data with unprecedented precision, significantly enhancing its ability to generalise to new, unseen data. The effectiveness of our approach is evidenced by the model's exceptional performance, achieving an accuracy of over 97% in recognising diverse hand gestures. This high level of accuracy highlights the model's potential to revolutionise how interactions are facilitated in rehabilitation contexts.
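To make the described architecture concrete, the sketch below illustrates the general pattern the abstract names: early fusion of EEG and EMG channels, parallel temporal convolutions at multiple kernel sizes, and a self-attention layer over the fused temporal features. This is a minimal NumPy illustration, not the authors' SAMTCN implementation; all shapes, kernel sizes, and weight initialisations here are assumptions chosen for demonstration, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def temporal_conv(x, w):
    """Valid 1-D convolution over time.
    x: (T, C_in), w: (k, C_in, C_out) -> (T - k + 1, C_out)."""
    k = w.shape[0]
    return np.stack([np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1]))
                     for t in range(x.shape[0] - k + 1)])

def self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the time axis.
    x: (T, d) -> (T, d)."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)          # row-wise softmax
    return A @ V

# Hypothetical dimensions: 128 time steps, 8 EEG + 4 EMG channels,
# 16 filters per branch, 6 gesture classes.
T, C_EEG, C_EMG, D, N_CLASSES = 128, 8, 4, 16, 6
eeg = rng.standard_normal((T, C_EEG))
emg = rng.standard_normal((T, C_EMG))
x = np.concatenate([eeg, emg], axis=1)          # early multimodal fusion

# Multi-temporal branches: convolutions with different kernel sizes
# capture temporal patterns at several scales.
kernels = [3, 5, 7]
branches = [temporal_conv(x, rng.standard_normal((k, x.shape[1], D)) * 0.1)
            for k in kernels]
L = min(b.shape[0] for b in branches)           # align branch lengths
feat = np.concatenate([b[:L] for b in branches], axis=1)  # (L, 3*D)

# Self-attention over the fused multi-scale features, then pooling
# and a linear read-out to per-gesture logits.
d = feat.shape[1]
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
attended = self_attention(feat, Wq, Wk, Wv)     # (L, 3*D)
pooled = attended.mean(axis=0)                  # global temporal pooling
logits = pooled @ (rng.standard_normal((d, N_CLASSES)) * 0.1)  # (6,)
```

In practice the convolutional and attention weights would be learned end-to-end; the sketch only shows how the data flows through fusion, multi-scale convolution, and attention.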
Date of Conference: 29-31 July 2024
Date Added to IEEE Xplore: 15 August 2024
Conference Location: London, United Kingdom
