ABSTRACT
Eye trackers have been used as pointing devices for a number of years. Because of inherent limitations in the accuracy of eye gaze, however, interaction is limited to objects spanning at least one degree of visual angle. Consequently, targets in gaze-based interfaces have sizes and layouts quite different from those of "natural settings". To accommodate these accuracy constraints, we developed a multimodal pointing technique that combines eye gaze and speech input. The technique was tested in a user study of pointing at multiple targets. The results suggest that, in terms of the footprint-accuracy tradeoff, pointing performance is best (~93% accuracy) for targets subtending 0.85 degrees of visual angle with 0.3-degree gaps between them. User performance thus approaches the practical limit of gaze pointing. It is therefore feasible to develop a user interface that supports hands-free interaction while retaining a design similar to today's common interfaces.
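The core idea of the technique — using a spoken label to disambiguate among small, closely spaced targets that all fall within the gaze-error radius — can be illustrated with a minimal sketch. This is not the paper's implementation; the label matching, the 60 cm viewing distance, the pixel density, and the error threshold are all illustrative assumptions, chosen only to show how a 0.85-degree target translates into screen coordinates and how speech narrows the gaze candidates.

```python
import math

def visual_angle_to_pixels(angle_deg, viewing_distance_cm, pixels_per_cm):
    """Screen size (in pixels) subtended by a visual angle at a given viewing distance."""
    size_cm = 2 * viewing_distance_cm * math.tan(math.radians(angle_deg) / 2)
    return size_cm * pixels_per_cm

def select_target(gaze_xy, targets, spoken_label, max_error_px):
    """Pick the target whose label matches the speech input and lies nearest
    to the gaze point, provided it is within the expected gaze-error radius."""
    candidates = [t for t in targets if t["label"] == spoken_label]
    if not candidates:
        return None
    gx, gy = gaze_xy
    best = min(candidates, key=lambda t: math.hypot(t["x"] - gx, t["y"] - gy))
    if math.hypot(best["x"] - gx, best["y"] - gy) <= max_error_px:
        return best
    return None

# Hypothetical setup: 0.85-degree targets viewed from 60 cm on a ~38 px/cm display.
target_px = visual_angle_to_pixels(0.85, 60, 38)
targets = [
    {"label": "red",  "x": 100, "y": 100},
    {"label": "blue", "x": 250, "y": 100},
    {"label": "red",  "x": 400, "y": 100},
]
# Gaze lands near the left pair; speech ("red") resolves which one is meant.
picked = select_target((110, 105), targets, "red", max_error_px=2 * target_px)
```

Under this sketch, the two "red" targets are ambiguous to gaze alone when they sit inside the error radius; the spoken label plus nearest-distance rule resolves the ambiguity, which is the footprint-accuracy tradeoff the abstract measures.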
Index Terms
- Speech-augmented eye gaze interaction with small closely spaced targets