ABSTRACT
Large screens are appearing in a variety of settings, motivating research on appropriate interaction techniques. While gesture input has been popularized by depth cameras, we contribute a comparison study showing that eye pointing is a valuable substitute for gesture pointing in dragging tasks. We compare eye pointing combined with gesture selection against gesture pointing and selection. Results clearly show that eye pointing combined with a selection gesture enables more accurate and faster dragging.