
A Hybrid Method Using Gaze and Controller for Targeting Tiny Targets in VR While Lying down

  • Conference paper
HCI International 2023 Posters (HCII 2023)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1836))


Abstract

Virtual reality (VR) head-mounted displays (HMDs) can be used in a variety of body postures, including standing, sitting, and lying down (supine). If VR HMDs can be used effectively from the supine position, they will not only allow bedridden people to access VR content comfortably but also evolve into familiar devices that, like smartphones, can be used in any body position. Existing VR interaction methods, such as moving a controller in the air, assume a standing or sitting posture; gaze input, by contrast, remains available while supine. However, gaze input has issues of its own: it is unsuitable for fine pointing because the estimated gaze position moves continually, even when the user tries to gaze at a particular target. We propose a hybrid method that uses both gaze and the controller to overcome this issue. The proposed method combines controller rotation, a single button, and gaze input from an eye tracker. An experiment compared three methods: the proposed method, a method using only gaze, and a method using only the controller. The experimental task was clicking tiny buttons that require high pointing resolution. The results show that when the eye tracker works as intended, the proposed method has an input speed almost equal to or faster than that of the controller-only method, and it is superior to the method that uses only the eye tracker.
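The abstract does not detail how gaze and controller hand off to each other. A minimal sketch of one plausible interpretation — gaze places the cursor coarsely, controller rotation nudges it with a small gain so gaze jitter cannot disturb fine pointing, and a single button confirms the click — could look like the following. All names, the gain value, and the phase logic are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a gaze + controller hybrid pointer.
# Assumptions (not from the paper): gaze drives the cursor coarsely,
# controller rotation deltas refine it with a small gain, and a single
# button press confirms the click. Coordinates are normalized [0, 1].

GAIN = 0.05  # degrees of controller rotation -> normalized screen units


class HybridPointer:
    def __init__(self):
        self.cursor = (0.0, 0.0)
        self.refining = False  # True once fine adjustment has begun

    def on_gaze(self, x, y):
        # Coarse phase: follow the (noisy) gaze estimate until the
        # user starts refining with the controller.
        if not self.refining:
            self.cursor = (x, y)

    def on_controller_rotation(self, dyaw, dpitch):
        # Fine phase: small rotations nudge the cursor; gaze updates
        # are frozen so their jitter cannot undo the adjustment.
        self.refining = True
        cx, cy = self.cursor
        self.cursor = (cx + dyaw * GAIN, cy + dpitch * GAIN)

    def on_button(self):
        # A single button confirms the click at the refined position
        # and returns control to the gaze (coarse) phase.
        target = self.cursor
        self.refining = False
        return target
```

For example, a gaze fixation at (0.5, 0.5) followed by a 1° yaw / −1° pitch correction would confirm a click at (0.55, 0.45), with any intervening gaze updates ignored during refinement.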




Author information

Corresponding author

Correspondence to Kouga Abe.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Abe, K., Ishikawa, H., Manabe, H. (2023). A Hybrid Method Using Gaze and Controller for Targeting Tiny Targets in VR While Lying down. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2023 Posters. HCII 2023. Communications in Computer and Information Science, vol 1836. Springer, Cham. https://doi.org/10.1007/978-3-031-36004-6_19


  • DOI: https://doi.org/10.1007/978-3-031-36004-6_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-36003-9

  • Online ISBN: 978-3-031-36004-6

  • eBook Packages: Computer Science, Computer Science (R0)
