DOI: 10.1145/3379156.3391829

Eye vs. Head: Comparing Gaze Methods for Interaction in Augmented Reality

Published: 02 June 2020

ABSTRACT

Visualization in virtual 3D environments can provide a natural way for users to explore data. However, interaction in augmented reality often requires arm and short head movements, which can be tiring and strenuous. In an effort toward more user-friendly interaction, we developed a prototype that allows users to manipulate virtual objects using a combination of eye gaze and an external clicker device. Using this prototype, we performed a user study comparing four different input methods, of which head gaze plus clicker was preferred by most participants.
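The abstract describes the interaction only at a high level. As a rough illustration (not the authors' implementation), the select-then-confirm pattern it outlines, where a gaze ray points at an object and a separate clicker event confirms the action, can be sketched as follows. All names, and the use of a ray-sphere test for gaze targeting, are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    """Hypothetical proxy for a selectable virtual object's bounds."""
    center: tuple
    radius: float

def gaze_hit(origin, direction, sphere):
    """Standard ray-sphere intersection test; `direction` must be unit length."""
    # Vector from the gaze origin to the sphere centre.
    oc = tuple(c - o for c, o in zip(sphere.center, origin))
    # Distance along the gaze ray to the point of closest approach.
    t = sum(a * b for a, b in zip(oc, direction))
    # Squared distance from the sphere centre to the ray at that point.
    d2 = sum(a * a for a in oc) - t * t
    return t >= 0 and d2 <= sphere.radius ** 2

def select_on_click(origin, direction, objects, clicker_pressed):
    """Gaze alone only points; the external clicker confirms the selection."""
    if not clicker_pressed:
        return None
    # Return the first object the gaze ray hits
    # (a real system would pick the nearest hit).
    for name, sphere in objects.items():
        if gaze_hit(origin, direction, sphere):
            return name
    return None
```

For example, with a single object one metre wide five metres ahead, `select_on_click((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), {"cube": Sphere((0.0, 0.0, 5.0), 1.0)}, True)` selects `"cube"`, while the same gaze with the clicker released selects nothing.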


  • Published in

    ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
    June 2020
    305 pages
    ISBN: 9781450371346
    DOI: 10.1145/3379156

    Copyright © 2020 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Qualifiers

    • short-paper
    • Research
    • Refereed limited

    Acceptance Rates

    Overall Acceptance Rate: 69 of 137 submissions (50%)

