Intention Recognition from Spatio-Temporal Representation of EEG Signals

  • Conference paper
  • In: Databases Theory and Applications (ADC 2021)
  • Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12610)

Abstract

A motor imagery brain-computer interface decodes human intention from brain activity to achieve control. The main technical challenges are representing and classifying the signal features that correspond to specific mental activities. Inspired by the structure and function of the human brain, we construct a neural computing model to address the critical issues in representing and recognizing, in real time, the states of specific mental activities. Taking into account the brain's physiological structure and information-processing process, we construct a multi-scale cascaded Conv-GRU model that extracts high-resolution feature information along both the spatial and temporal dimensions, effectively removing signal noise, improving the signal-to-noise ratio, and reducing information loss. Extensive experiments demonstrate that our model has low dependence on training data size and outperforms state-of-the-art multi-intention recognition methods.
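The two-stage design described above (convolution for local spatio-temporal feature extraction, followed by a recurrent unit that aggregates those features over time) can be sketched as below. This is a minimal NumPy illustration of the general Conv-GRU pattern, not the authors' exact architecture: the layer sizes, kernel width, single-scale convolution, and random weights are all illustrative assumptions.

```python
# Minimal sketch of a Conv-GRU intention-recognition pipeline:
# stage 1 convolves EEG channels over time, stage 2 runs a GRU cell
# over the feature sequence, and a softmax read-out scores classes.
import numpy as np

rng = np.random.default_rng(0)

def conv1d_time(x, w, b):
    """Valid 1-D convolution over the time axis.
    x: (T, C_in), w: (k, C_in, C_out), b: (C_out,) -> (T-k+1, C_out)."""
    k = w.shape[0]
    out = np.empty((x.shape[0] - k + 1, w.shape[2]))
    for t in range(out.shape[0]):
        out[t] = np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(h, x, P):
    """One GRU cell step; P holds the gate weights over [input, state]."""
    xh = np.concatenate([x, h])
    z = sigmoid(P["Wz"] @ xh + P["bz"])                     # update gate
    r = sigmoid(P["Wr"] @ xh + P["br"])                     # reset gate
    h_tilde = np.tanh(P["Wh"] @ np.concatenate([x, r * h]) + P["bh"])
    return (1 - z) * h + z * h_tilde

# Toy dimensions: 64 EEG channels, 160 time samples, 5 intention classes.
T, C_in, C_feat, H, n_classes = 160, 64, 32, 48, 5
x = rng.standard_normal((T, C_in))                          # one EEG trial

# Stage 1: convolution extracts local spatio-temporal features.
w = rng.standard_normal((5, C_in, C_feat)) * 0.05
feats = conv1d_time(x, w, np.zeros(C_feat))

# Stage 2: the GRU aggregates the feature sequence over time.
P = {
    "Wz": rng.standard_normal((H, C_feat + H)) * 0.05, "bz": np.zeros(H),
    "Wr": rng.standard_normal((H, C_feat + H)) * 0.05, "br": np.zeros(H),
    "Wh": rng.standard_normal((H, C_feat + H)) * 0.05, "bh": np.zeros(H),
}
h = np.zeros(H)
for t in range(feats.shape[0]):
    h = gru_step(h, feats[t], P)

# Linear + softmax read-out over intention classes.
logits = rng.standard_normal((n_classes, H)) * 0.05 @ h
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

A trained model would learn `w`, `P`, and the read-out weights by gradient descent; the paper's multi-scale variant would apply several kernel widths in stage 1 and concatenate their outputs before the GRU.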


Notes

  1. https://physionet.org/pn4/eegmmidb/
  2. https://drive.google.com/drive/folders/0B9MuJb6Xx2PIM0otakxuVHpkWkk


Acknowledgement

This research has been supported by the Fundamental Research Funds for the Central Universities under Grant No. 2412019FZ047, the China Postdoctoral Science Foundation under Grant No. 2017M621192, the National Natural Science Foundation of China (NSFC) under Grant No. 61972384, and the Outstanding Sino-foreign Youth Exchange Program of the China Association for Science and Technology.

Author information

Correspondence to Xiaowei Zhao.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Yue, L., Tian, D., Jiang, J., Yao, L., Chen, W., Zhao, X. (2021). Intention Recognition from Spatio-Temporal Representation of EEG Signals. In: Qiao, M., Vossen, G., Wang, S., Li, L. (eds.) Databases Theory and Applications. ADC 2021. Lecture Notes in Computer Science, vol 12610. Springer, Cham. https://doi.org/10.1007/978-3-030-69377-0_1

  • DOI: https://doi.org/10.1007/978-3-030-69377-0_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-69376-3

  • Online ISBN: 978-3-030-69377-0

  • eBook Packages: Computer Science, Computer Science (R0)
