Exploring the design space of eyes-free target acquisition in virtual environments

  • Original Article
  • Published in: Virtual Reality 26, 513–524 (2022)

Abstract

Supporting smooth target acquisition is an important objective in immersive virtual reality (VR) environments. In traditional VR systems, however, users are obliged to search for objects through the visual channel. Such eyes-engaged techniques can significantly degrade interaction efficiency and user experience, particularly when users must turn their heads frequently to locate virtual objects within the limited field of view of a head-mounted display. In this paper, we report a two-stage study that investigates the ability of VR users to acquire spatial targets without eye engagement (i.e., eyes-free target acquisition). First, we measure users' eyes-free performance in terms of control accuracy and subjective task load. Second, we evaluate the effects of eyes-free acquisition on memory capacity, spatial offset, and task completion time in the context of a VR game. Starting from a set of 54 spatial positions, we identify 18 optimal locations (half on the left side of the user's body and half on the right) that allow accurate and comfortable target acquisition without visual attention. After a short training period, users could accurately and quickly acquire 17 targets in a VR game, with an average offset of 10.5 cm and an average completion time of 2.7 s. Based on these results, we suggest how to optimize the spatial layout, number of targets, target locations, and interaction techniques for eyes-free acquisition in VR applications. Our work can serve as a foundation for the future development of eyes-free target acquisition methods in VR.
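The two headline metrics in the abstract, spatial offset and task completion time, are straightforward to reproduce from trial logs. The Python sketch below is hypothetical and not taken from the paper's materials: it assumes each logged trial records the intended target position, the controller position at the moment of selection, and start/end timestamps, and it computes the offset as the 3D Euclidean distance between the two positions.

import math
from dataclasses import dataclass

@dataclass
class Trial:
    # Hypothetical log record for one acquisition trial (not the authors' schema).
    target: tuple      # (x, y, z) intended target position, metres
    selection: tuple   # (x, y, z) controller position at selection, metres
    start_time: float  # trial start, seconds
    end_time: float    # selection confirmed, seconds

def spatial_offset_m(trial):
    # Offset = 3D Euclidean distance between intended and acquired positions.
    return math.dist(trial.target, trial.selection)

def summarize(trials):
    # Mean offset (reported in cm in the paper) and mean completion time (s).
    offsets_cm = [spatial_offset_m(t) * 100.0 for t in trials]
    times_s = [t.end_time - t.start_time for t in trials]
    return (sum(offsets_cm) / len(offsets_cm), sum(times_s) / len(times_s))

# A single made-up trial that happens to miss by 10.5 cm and take 2.7 s,
# matching the averages reported in the abstract.
trial = Trial(target=(0.30, 1.20, 0.40),
              selection=(0.30, 1.305, 0.40),
              start_time=0.0, end_time=2.7)
print(summarize([trial]))  # -> (10.5..., 2.7)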

Acknowledgements

The authors wish to thank the anonymous reviewers for their insightful comments. We also want to thank Wanying Feng, Wanxue Xu, and Jinfang Zhang for their help in user interviews and data collection. This work was supported by the National Natural Science Foundation of China under Grant No. 61772564 and the Guangdong Basic and Applied Basic Research Foundation under Grant No. 2021A1515011990.

Author information

Corresponding author

Correspondence to Huiyue Wu.

Ethics declarations

Conflict of interest

The authors declare that they have no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (MP4, 55,911 KB)

About this article

Cite this article

Wu, H., Huang, K., Deng, Y. et al. Exploring the design space of eyes-free target acquisition in virtual environments. Virtual Reality 26, 513–524 (2022). https://doi.org/10.1007/s10055-021-00591-6
