
Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13306))


Abstract

Measuring biometric information helps estimate a user's degree of excitement and their positive and negative emotions. By measuring a person's biometric signals during a virtual reality (VR) experience, the content can be changed interactively according to the person's estimated emotional state. However, the hassle and discomfort of wearing sensors interferes with the VR experience, and the body motion that the experience induces prevents accurate measurement. Some studies have therefore developed devices that embed biometric sensors in head-mounted displays (HMDs). Because an HMD is pressed against the face during use, biometric sensing through the HMD is robust to body movement and reduces the discomfort of sensor attachment. This paper introduces our research on HMDs with embedded sensors, including our previous studies in this project, surveys other biological sensing HMDs, and discusses VR applications that use them.
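The abstract's core idea, adapting VR content to an emotional state estimated from biometric signals, can be sketched as a simple biofeedback loop. The following is a minimal illustration, not the authors' implementation: function names, the RMSSD-based arousal estimate, and all thresholds are hypothetical choices for the sketch.

```python
# A minimal sketch (not the authors' implementation) of a biofeedback
# loop: estimate arousal from heart-rate variability and adapt a VR
# content parameter. All names and constants are hypothetical.

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms),
    a standard short-term heart-rate-variability (HRV) measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def estimate_arousal(rr_intervals_ms, baseline_rmssd):
    """Read lower HRV (relative to a resting baseline) as higher
    arousal; return a value clamped to [0, 1]."""
    ratio = rmssd(rr_intervals_ms) / baseline_rmssd
    return max(0.0, min(1.0, 1.0 - ratio))

def adjust_scene_intensity(arousal, current=0.5, gain=0.2):
    """Proportionally steer the scene toward calmer content when the
    user is highly aroused, and more intense content when relaxed."""
    target = 1.0 - arousal
    return current + gain * (target - current)

# Example: perfectly steady heartbeats (zero HRV) read as maximum
# arousal, so the controller lowers the scene intensity.
arousal = estimate_arousal([800] * 5, baseline_rmssd=25.0)
intensity = adjust_scene_intensity(arousal)
```

In a real system the RR intervals would stream from the HMD-embedded sensor, and the intensity parameter would drive the VR engine each frame; the point of the sketch is only the estimate-then-adapt structure.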



Acknowledgments

This work was supported by the Council for Science, Technology and Innovation, Cross-ministerial Strategic Innovation Promotion Program (SIP), "Big-data and AI-enabled Cyberspace Technologies" (funding agency: NEDO).

Author information


Corresponding author

Correspondence to Yuki Ban.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ban, Y., Inazawa, M. (2022). Development of the Biological Sensing Head Mounted Display. In: Yamamoto, S., Mori, H. (eds) Human Interface and the Management of Information: Applications in Complex Technological Environments. HCII 2022. Lecture Notes in Computer Science, vol 13306. Springer, Cham. https://doi.org/10.1007/978-3-031-06509-5_22


  • DOI: https://doi.org/10.1007/978-3-031-06509-5_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-06508-8

  • Online ISBN: 978-3-031-06509-5

  • eBook Packages: Computer Science, Computer Science (R0)
