Multi-Grasp Classification for the Control of Robot Hands Employing Transformers and Lightmyography Signals


Abstract:

The increasing use of smart technical devices in our everyday lives has necessitated muscle-machine interfaces (MuMIs) that are intuitive and that can facilitate immersive interactions with these devices. The most common method of developing MuMIs uses Electromyography (EMG) based signals. However, due to several drawbacks of EMG-based interfaces, alternative methods of developing MuMIs are being explored. In our previous work, we presented a new MuMI called Lightmyography (LMG), which achieved outstanding results compared to a classic EMG-based interface in a five-gesture classification task. In this study, we extend our previous work by experimentally validating the efficiency of the LMG armband in classifying thirty-two different gestures from six participants using a deep learning technique called Temporal Multi-Channel Vision Transformers (TMC-ViT). The efficiency of the proposed model was assessed using classification accuracy. Moreover, two different undersampling techniques are compared. The proposed thirty-two-gesture classifiers achieve accuracies as high as 92%. Finally, we employ the LMG interface in the real-time control of a robotic hand using ten different gestures, successfully reproducing several grasp types from grasp taxonomies presented in the literature.
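The abstract mentions comparing two undersampling techniques for the multi-channel LMG signals but does not name them. As a purely illustrative sketch (the function names, channel count, and choice of techniques here are assumptions, not the paper's method), two common ways to undersample a multi-channel time series are uniform decimation and non-overlapping window averaging:

```python
import numpy as np

# Hypothetical 8-channel LMG recording: 1000 time samples per channel.
rng = np.random.default_rng(0)
signal = rng.normal(size=(1000, 8))

def undersample_decimate(x, factor):
    """Keep every `factor`-th sample (uniform decimation)."""
    return x[::factor]

def undersample_average(x, factor):
    """Average non-overlapping windows of length `factor` along time."""
    n = (x.shape[0] // factor) * factor          # drop any trailing remainder
    return x[:n].reshape(-1, factor, x.shape[1]).mean(axis=1)

short_a = undersample_decimate(signal, 4)   # shape (250, 8)
short_b = undersample_average(signal, 4)    # shape (250, 8)
```

Both reduce the sequence length fed to the classifier by the same factor; decimation discards intermediate samples, while averaging retains some of their information at the cost of smoothing fast transients.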
Date of Conference: 24-27 July 2023
Date Added to IEEE Xplore: 11 December 2023
PubMed ID: 38082669
Conference Location: Sydney, Australia
