ABSTRACT
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of gaze-based steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches that differ in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs were iteratively refined based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows high potential, as it allows the moving speed and direction to change gradually depending on the user's point-of-regard. This reduces overshooting problems and the need for dwell-time activations. We also investigate discrete constant input, for which virtual buttons are toggled by gaze dwelling. As a more flexible alternative for discrete input, we propose the Sticky Gaze Pointer.
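The continuous, gradient-based approach described above can be illustrated with a minimal sketch. This is an assumption about one plausible mapping, not the paper's exact design: speed and turn rate scale with the distance of the point-of-regard from the screen center, so glances near the center yield slow, precise motion while looking farther out accelerates movement. All parameters (screen size, dead zone, speed limits) are hypothetical.

```python
import math

def gaze_to_velocity(gaze_x, gaze_y, screen_w=1920, screen_h=1080,
                     dead_zone=0.1, max_speed=5.0, max_turn=45.0):
    """Map a gaze point (pixels) to (forward_speed, yaw_rate).

    Hypothetical gradient mapping: velocity grows with the gaze
    offset from the screen center, with a central dead zone in
    which no movement occurs.
    """
    # Normalize the gaze point to [-1, 1] relative to the screen center.
    nx = (gaze_x - screen_w / 2) / (screen_w / 2)
    ny = (gaze_y - screen_h / 2) / (screen_h / 2)
    dist = math.hypot(nx, ny)
    if dist < dead_zone:
        # Looking near the center: no movement (avoids the Midas touch).
        return 0.0, 0.0
    # Gradient: scale linearly from the dead-zone edge to the screen border.
    gain = min((dist - dead_zone) / (1.0 - dead_zone), 1.0)
    speed = max_speed * gain * max(-ny, 0.0)  # looking upward moves forward
    yaw = max_turn * gain * nx                # horizontal offset steers
    return speed, yaw
```

A gradient like this avoids the overshooting mentioned in the abstract: as the user's gaze returns toward the center, the velocity decays smoothly instead of cutting off at a hard boundary.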
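The discrete constant condition relies on dwell-activated virtual buttons. The sketch below shows one generic way such a toggle could work (the paper's actual button layout and dwell threshold are not specified here; the 500 ms value and class design are assumptions): a button flips its state once the gaze has rested on it for the dwell duration, and fires at most once per fixation.

```python
class DwellButton:
    """Hypothetical dwell-activated toggle button for gaze input."""

    def __init__(self, rect, dwell_ms=500):
        self.rect = rect        # (x, y, width, height) in pixels
        self.dwell_ms = dwell_ms
        self.gaze_enter = None  # timestamp when the gaze entered the button
        self.fired = False      # has this fixation already toggled?
        self.active = False

    def contains(self, x, y):
        bx, by, bw, bh = self.rect
        return bx <= x < bx + bw and by <= y < by + bh

    def update(self, gaze_x, gaze_y, now_ms):
        """Feed one gaze sample; return True if the button just toggled."""
        if self.contains(gaze_x, gaze_y):
            if self.gaze_enter is None:
                self.gaze_enter = now_ms  # fixation starts: arm the timer
                self.fired = False
            elif not self.fired and now_ms - self.gaze_enter >= self.dwell_ms:
                self.active = not self.active  # dwell completed: toggle
                self.fired = True              # one toggle per fixation
                return True
        else:
            self.gaze_enter = None  # gaze left the button: reset the timer
        return False
```

Requiring the gaze to leave and re-enter before the next toggle prevents repeated accidental activations during a long fixation, which is one of the dwell-time drawbacks the gradient-based approach avoids.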
Index Terms
- Designing gaze-based user interfaces for steering in virtual environments