Multi-modal Decoding of Reach-to-Grasping from EEG and EMG via Neural Networks

  • Conference paper
Artificial Neural Networks in Pattern Recognition (ANNPR 2024)

Abstract

Convolutional neural networks (CNNs) have revolutionized motor decoding from electroencephalographic (EEG) signals, showcasing their ability to outperform traditional machine learning, especially for Brain-Computer Interface (BCI) applications. Motor decoding improves further when other recording modalities (e.g., electromyography, EMG) are processed together with EEG signals. However, multi-modal algorithms for decoding hand movements have mainly been applied to simple movements (e.g., wrist flexion/extension), while their adoption for decoding complex movements (e.g., different grip types) is still under-investigated. In this study, we recorded EEG and EMG signals from 12 participants while they performed a delayed reach-to-grasping task towards one of four possible objects (a handle, a pin, a card, and a ball), and we addressed multi-modal EEG+EMG decoding with a dual-branch CNN, with each branch based on EEGNet. The performance of the multi-modal approach was compared to mono-modal baselines (based on EEG or EMG only). The multi-modal EEG+EMG pipeline outperformed the EEG-based pipeline during movement initiation and outperformed the EMG-based pipeline during motor preparation. Finally, the multi-modal approach accurately discriminated between grip types throughout most of the task, especially from movement initiation onward. Our results further validate multi-modal decoding for potential future BCI applications, with the aim of achieving a more natural user experience.
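The abstract describes the architecture only at a high level. As a concrete illustration, below is a minimal PyTorch sketch of a dual-branch network in which each branch is an EEGNet-style encoder (Lawhern et al., 2018) and the two branch embeddings are concatenated before a shared classification head. This is not the authors' implementation: the channel counts, window length, kernel sizes, and concatenation-based fusion are assumptions chosen for illustration only.

```python
# Illustrative sketch (NOT the authors' code): a dual-branch CNN where each
# branch is an EEGNet-style encoder and embeddings are fused by concatenation.
# All hyperparameters below are assumptions for the sake of the example.
import torch
import torch.nn as nn


class EEGNetBranch(nn.Module):
    """EEGNet-style encoder: temporal conv -> depthwise spatial conv -> separable conv."""

    def __init__(self, n_channels: int, f1: int = 8, d: int = 2, f2: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            # Temporal convolution over time samples
            nn.Conv2d(1, f1, kernel_size=(1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(f1),
            # Depthwise spatial convolution across electrodes
            nn.Conv2d(f1, f1 * d, kernel_size=(n_channels, 1), groups=f1, bias=False),
            nn.BatchNorm2d(f1 * d),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(0.5),
            # Separable convolution: depthwise temporal + pointwise mixing
            nn.Conv2d(f1 * d, f1 * d, kernel_size=(1, 16), padding=(0, 8),
                      groups=f1 * d, bias=False),
            nn.Conv2d(f1 * d, f2, kernel_size=1, bias=False),
            nn.BatchNorm2d(f2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.5),
            nn.Flatten(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, channels, time)
        return self.net(x)


class DualBranchDecoder(nn.Module):
    """Fuses EEG and EMG branch embeddings for grip-type classification."""

    def __init__(self, eeg_channels: int, emg_channels: int, samples: int,
                 n_classes: int = 4):
        super().__init__()
        self.eeg_branch = EEGNetBranch(eeg_channels)
        self.emg_branch = EEGNetBranch(emg_channels)
        with torch.no_grad():  # infer flattened embedding sizes with dummy inputs
            n_eeg = self.eeg_branch(torch.zeros(1, 1, eeg_channels, samples)).shape[1]
            n_emg = self.emg_branch(torch.zeros(1, 1, emg_channels, samples)).shape[1]
        self.classifier = nn.Linear(n_eeg + n_emg, n_classes)

    def forward(self, eeg: torch.Tensor, emg: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.eeg_branch(eeg), self.emg_branch(emg)], dim=1)
        return self.classifier(z)  # logits over the four grip types


# Example: 61 EEG channels, 8 EMG channels, 1-s windows at 128 Hz (all assumed)
model = DualBranchDecoder(eeg_channels=61, emg_channels=8, samples=128)
logits = model(torch.randn(4, 1, 61, 128), torch.randn(4, 1, 8, 128))
```

Concatenating embeddings is one simple fusion strategy; it lets the shared classifier weight EEG- and EMG-derived features differently across task phases. The fusion scheme actually used in the paper may differ, and a mono-modal baseline corresponds to a single branch feeding the classifier directly.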


Notes

  1. The study was approved by the Bioethics Committee of the University of Bologna (protocol code: 61243, date of approval: 15 March 2021).


Acknowledgments

This work is supported by #NEXTGENERATIONEU (NGEU) and funded by the Ministry of University and Research (MUR), National Recovery and Resilience Plan (NRRP), project MNESYS (PE0000006)—A Multiscale integrated approach to the study of the nervous system in health and disease (DN. 1553 11.10.2022).

Author information


Corresponding author

Correspondence to Davide Borra.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Borra, D., Fraternali, M., Ravanelli, M., Magosso, E. (2024). Multi-modal Decoding of Reach-to-Grasping from EEG and EMG via Neural Networks. In: Suen, C.Y., Krzyzak, A., Ravanelli, M., Trentin, E., Subakan, C., Nobile, N. (eds) Artificial Neural Networks in Pattern Recognition. ANNPR 2024. Lecture Notes in Computer Science, vol. 15154. Springer, Cham. https://doi.org/10.1007/978-3-031-71602-7_15

  • DOI: https://doi.org/10.1007/978-3-031-71602-7_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-71601-0

  • Online ISBN: 978-3-031-71602-7

  • eBook Packages: Computer Science, Computer Science (R0)
