DOI: 10.1145/3317956.3318153

A comparative study of eye tracking and hand controller for aiming tasks in virtual reality

Published: 25 June 2019

Abstract

Aiming is key for virtual reality (VR) interaction, and it is often done using VR controllers. Recent eye-tracking integrations in commercial VR head-mounted displays (HMDs) call for further research on usability and performance to better determine their possibilities and limitations. This paper presents a user study comparing gaze aiming in VR with a traditional controller in an "aim and shoot" task, across different target speeds and trajectories. Subjective data were gathered using the System Usability Scale (SUS) and the NASA TLX cognitive-load questionnaire. Results show a lower perceived cognitive load with gaze aiming and an on-par usability score. Gaze aiming produced comparable task durations but lower accuracy in most conditions. Lastly, the target's trajectory significantly affected the orientation of the HMD relative to the target's location. The results show the potential of gaze aiming in VR and motivate further research.
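The SUS questionnaire mentioned above is scored with a fixed formula (Brooke, 1996): each odd-numbered item contributes its rating minus 1, each even-numbered item contributes 5 minus its rating, and the summed contributions (0–40) are multiplied by 2.5 to yield a 0–100 score. A minimal illustrative sketch of that scoring, not the authors' analysis code:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten Likert ratings.

    `responses` lists the ten ratings (each 1..5) in questionnaire order.
    Odd-numbered items are positively worded (contribution = rating - 1);
    even-numbered items are negatively worded (contribution = 5 - rating).
    The summed contributions (0..40) are scaled by 2.5 to a 0..100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings in the range 1..5")
    # 0-based index i: even i corresponds to odd-numbered items 1, 3, 5, ...
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# A respondent who fully agrees with every positive item and fully
# disagrees with every negative one reaches the maximum score:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```

Scores around 68 are commonly treated as average usability, following Bangor et al.'s survey of SUS studies.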




Published In

ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019
623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. VR
  2. aiming
  3. controller
  4. gaze interaction
  5. performance
  6. usability

Qualifiers

  • Research-article

Conference

ETRA '19

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

Article Metrics

  • Downloads (last 12 months): 214
  • Downloads (last 6 weeks): 15
Reflects downloads up to 12 Feb 2025

