Research article · DOI: 10.1145/2168556.2168577

Designing gaze-based user interfaces for steering in virtual environments

Published: 28 March 2012

ABSTRACT

Because eye gaze can serve as an efficient and natural input for steering through virtual 3D scenes, we investigate the design of gaze-based steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches that differ in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs were iteratively refined based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows high potential, because it allows the moving speed and direction to change gradually depending on a user's point-of-regard. This reduces overshooting problems and the number of dwell-time activations. We also investigate discrete constant input, for which virtual buttons are toggled by gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of providing discrete input.
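The continuous, gradient-based steering described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the linear speed ramp, and the `dead_zone` and `max_speed` parameters are illustrative assumptions; it only shows the general idea of mapping the point-of-regard's offset from the screen centre to a steering velocity, with a central dead zone that avoids drift and overshooting.

```python
# Hedged sketch of gradient-based gaze steering (illustrative only;
# all parameter names and the linear ramp are assumptions, not taken
# from the paper).

def steering_velocity(gaze_x, gaze_y, width, height,
                      dead_zone=0.1, max_speed=1.0):
    """Map a 2D point-of-regard (in pixels) to a steering velocity.

    Speed grows linearly with the gaze point's normalized distance
    from the screen centre; gazing inside the central dead zone
    yields zero velocity.
    """
    # Normalize gaze coordinates to [-1, 1], origin at screen centre.
    nx = (gaze_x - width / 2) / (width / 2)
    ny = (gaze_y - height / 2) / (height / 2)
    dist = (nx ** 2 + ny ** 2) ** 0.5
    if dist < dead_zone:
        return 0.0, 0.0  # looking near the centre: no movement
    # Ramp speed from 0 at the dead-zone edge up to max_speed.
    speed = max_speed * min((dist - dead_zone) / (1 - dead_zone), 1.0)
    # Steer in the direction of the gaze offset.
    return speed * nx / dist, speed * ny / dist
```

Because both speed and direction vary continuously with the point-of-regard, no dwell-time confirmation is needed to start, stop, or redirect movement, which matches the advantage the abstract attributes to this input combination.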

Supplemental Material

p131-stellmach.mp4 (MP4, 25 MB)


Published in

ETRA '12: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2012, 420 pages
ISBN: 9781450312219
DOI: 10.1145/2168556

Copyright © 2012 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%

