Replacing EEG Sensors by AI Based Emulation

  • Conference paper
Augmented Reality, Virtual Reality, and Computer Graphics (AVR 2021)

Abstract

Electroencephalography (EEG) has become a widely used non-invasive measurement method for brain-computer interfaces (BCI). Hybrid BCI (hBCI) additionally incorporate other physiological indicators, also called bio-signals, and improve the decoding of brain signals by evaluating a variety of different sensor data. Although significant progress has been made in the field of BCI, the correlation of data from different sensors, as well as the possible redundancy of certain sensors, has been studied less frequently. Our concept presents a theoretical, deep-learning-based approach to potentially replacing one sensor with the measurements of others. A costly or difficult-to-acquire sensor measurement could thus be left out of a setup entirely without losing its functionality. In this context, we additionally propose a conceptual framework that facilitates and improves the generation of scientifically meaningful data by collecting it within a corresponding VR application and set-up. The evaluation of the collected sensor data, described in five consecutive steps, is to cluster the data of one sensor and to classify the data from the other sensors into these clusters. Afterwards, the sensor data in each cluster are analysed for patterns. Through this predictive analysis of the remaining sensors, the required number of sensors can be reduced, allowing valid statements about the output of the original sensor without actually using it. An artificial intelligence (AI) based EEG emulation derived from other, directly related bio-signals could therefore potentially replace EEG measurements and thus indirectly enable the use of BCI in situations where it was previously not possible. Future work should clarify open questions concerning the realisation of the concept and how it could be developed further.
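
For illustration, the cluster-then-classify idea sketched in the abstract could be pictured as the following minimal Python example. It uses scikit-learn on synthetic data and substitutes a simple k-means clustering plus a random-forest classifier for the deep-learning models the concept envisions; all variable names (e.g. eeg_features, bio_features) and the data itself are hypothetical placeholders, not the authors' implementation.

    # Hypothetical sketch: cluster the data of one sensor (EEG), classify the data
    # of the other bio-signal sensors into these clusters, and emulate the left-out
    # sensor from the cluster centroids. Synthetic data only.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-ins for time-windowed features: EEG band powers (sensor to be emulated)
    # and correlated bio-signals such as heart rate, skin conductance, eye tracking.
    n_windows = 2000
    eeg_features = rng.normal(size=(n_windows, 8))
    mixing = rng.normal(size=(8, 5))
    bio_features = eeg_features @ mixing + 0.3 * rng.normal(size=(n_windows, 5))

    # Step 1: cluster the target sensor's data into prototypical EEG states.
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
    eeg_states = kmeans.fit_predict(eeg_features)

    # Step 2: learn to predict those EEG states from the other sensors alone.
    X_train, X_test, y_train, y_test = train_test_split(
        bio_features, eeg_states, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    # Step 3: at run time the EEG sensor is left out; its state is inferred from
    # the remaining bio-signals, and the cluster centroid serves as a coarse proxy
    # ("emulation") of the original EEG feature vector.
    predicted_states = clf.predict(X_test)
    emulated_eeg = kmeans.cluster_centers_[predicted_states]

    print("EEG-state prediction accuracy:", accuracy_score(y_test, predicted_states))

In such a pipeline, the accuracy of the state prediction and the distance between emulated and measured EEG features would indicate whether the EEG sensor is in fact redundant for the task at hand.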

Acknowledgement

We would like to thank Thomas Odaker, Elisabeth Mayer and Lea Weil who supported this work with helpful discussions and feedback.

Author information

Corresponding author

Correspondence to Fabio Genz.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Genz, F., Hufeld, C., Müller, S., Kolb, D., Starck, J., Kranzlmüller, D. (2021). Replacing EEG Sensors by AI Based Emulation. In: De Paolis, L.T., Arpaia, P., Bourdot, P. (eds) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2021. Lecture Notes in Computer Science, vol. 12980. Springer, Cham. https://doi.org/10.1007/978-3-030-87595-4_6

  • DOI: https://doi.org/10.1007/978-3-030-87595-4_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-87594-7

  • Online ISBN: 978-3-030-87595-4

  • eBook Packages: Computer Science, Computer Science (R0)
