ABSTRACT
Visualization in virtual 3D environments can provide a natural way for users to explore data. However, interaction in augmented reality often requires arm movements and small head movements, which can be tiring and strenuous. Toward more user-friendly interaction, we developed a prototype that lets users manipulate virtual objects with a combination of eye gaze and an external clicker device. In a user study with this prototype, we compared four input methods; head gaze plus clicker was preferred by most participants.
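The core of such a gaze-plus-clicker interface can be sketched as a gaze ray test combined with an explicit confirmation event. The following is a minimal illustrative sketch, not the paper's actual implementation: the `Target`, `gaze_hit`, and `select_on_click` names are assumptions, and the hit test is a simple ray-sphere intersection standing in for whatever picking the prototype used.

```python
from dataclasses import dataclass

@dataclass
class Target:
    # Hypothetical scene object: a name plus a bounding sphere.
    name: str
    center: tuple  # (x, y, z)
    radius: float

def gaze_hit(origin, direction, target):
    """Return True if the gaze ray from `origin` along unit vector
    `direction` passes within `target.radius` of the target center."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = target.center
    # Vector from the ray origin to the sphere center.
    vx, vy, vz = cx - ox, cy - oy, cz - oz
    # Distance along the ray to the closest point to the center.
    t = vx * dx + vy * dy + vz * dz
    if t < 0:
        return False  # target lies behind the viewer
    px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
    dist_sq = (cx - px) ** 2 + (cy - py) ** 2 + (cz - pz) ** 2
    return dist_sq <= target.radius ** 2

def select_on_click(origin, direction, targets, clicker_pressed):
    """Gaze aims, the clicker confirms: return the first gazed-at
    target when the external clicker is pressed, else None."""
    if not clicker_pressed:
        return None
    for t in targets:
        if gaze_hit(origin, direction, t):
            return t
    return None
```

Separating aiming (gaze) from triggering (clicker) avoids dwell-time waiting and the "Midas touch" problem of gaze-only selection, which is the design trade-off the compared input methods probe.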