DOI: 10.1145/1054972.1054994

EyeWindows: evaluation of eye-controlled zooming windows for focus selection

Published: 02 April 2005

Abstract

In this paper, we present an attentive windowing technique that uses eye tracking, rather than manual pointing, for focus window selection. We evaluated the performance of four focus selection techniques in a typing task with many open windows: eye tracking with key activation, eye tracking with automatic activation, mouse, and hotkeys. We also evaluated a zooming windowing technique designed specifically for eye-based control, comparing its performance to that of a standard tiled windowing environment. Results indicated that eye tracking with automatic activation was, on average, about twice as fast as mouse and hotkeys. Eye tracking with key activation was about 72% faster than the manual conditions, and was preferred by most participants. We believe eye input performed well because it allows manual input to be provided in parallel with focus selection tasks. Results also suggested that zooming windows outperform static tiled windows by about 30%, and that this performance gain scaled with the number of windows used. We conclude that eye-controlled zooming windows with key activation provide an efficient and effective alternative to current focus window selection techniques.
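The two eye-based activation modes compared in the abstract can be illustrated with a small state-machine sketch. This is an illustrative sketch only, not the paper's implementation: `Window`, `FocusSelector`, and the dwell-sample threshold are hypothetical names, and the actual system used a calibrated eye tracker and a zooming window manager.

```python
from dataclasses import dataclass

@dataclass
class Window:
    """A hypothetical on-screen window, as an axis-aligned rectangle."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx: int, gy: int) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def gazed_window(windows, gx, gy):
    """Hit-test the current gaze point against the open windows."""
    for win in windows:
        if win.contains(gx, gy):
            return win
    return None

class FocusSelector:
    """Sketch of the two activation modes from the abstract:
    - 'key':  focus moves to the gazed window only when the user presses a key.
    - 'auto': focus moves once the gaze has dwelt on a window for
      `dwell_samples` consecutive gaze samples (a stand-in for a
      dwell-time threshold; the real threshold would be in milliseconds)."""

    def __init__(self, windows, mode="key", dwell_samples=3):
        self.windows = windows
        self.mode = mode
        self.dwell_samples = dwell_samples
        self.focused = None      # currently focused window
        self._candidate = None   # window the gaze is currently resting on
        self._count = 0          # consecutive samples on the candidate

    def on_gaze(self, gx, gy, key_pressed=False):
        """Process one gaze sample; returns the (possibly updated) focus."""
        target = gazed_window(self.windows, gx, gy)
        if target is None:
            self._candidate, self._count = None, 0
            return self.focused
        if self.mode == "key":
            # Gaze selects the candidate; the keypress commits the selection.
            if key_pressed:
                self.focused = target
        else:
            # Automatic activation: commit after an uninterrupted dwell.
            if target is self._candidate:
                self._count += 1
            else:
                self._candidate, self._count = target, 1
            if self._count >= self.dwell_samples:
                self.focused = target
        return self.focused
```

In this framing, the dwell threshold in automatic mode trades selection speed against accidental "Midas touch" activations, while key activation lets the eyes rest anywhere without side effects until the user commits, which may help explain the participant preference the abstract reports.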




Published In

CHI '05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2005
928 pages
ISBN: 1581139985
DOI: 10.1145/1054972

Publisher

Association for Computing Machinery, New York, NY, United States



Author Tags

  1. alternative input
  2. attentive user interfaces
  3. eye tracking

Qualifiers

  • Article

Conference

CHI '05

Acceptance Rates

CHI '05 paper acceptance rate: 93 of 372 submissions (25%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)



Cited By

  • (2024) GEARS: Generalizable Multi-Purpose Embeddings for Gaze and Hand Data in VR Interactions. Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, pp. 279-289. DOI: 10.1145/3627043.3659551. Online publication date: 22-Jun-2024.
  • (2024) Attention-Aware Visualization: Tracking and Responding to User Perception Over Time. IEEE Transactions on Visualization and Computer Graphics 31(1), pp. 1017-1027. DOI: 10.1109/TVCG.2024.3456300. Online publication date: 9-Sep-2024.
  • (2023) Learning Individualized Automatic Content Magnification in Gaze-based Interaction. 2023 IEEE International Symposium on Multimedia (ISM), pp. 282-286. DOI: 10.1109/ISM59092.2023.00054. Online publication date: 11-Dec-2023.
  • (2022) The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality. ACM Computing Surveys 55(3), pp. 1-39. DOI: 10.1145/3491207. Online publication date: 25-Mar-2022.
  • (2022) Evaluating the Performance of Machine Learning Algorithms in Gaze Gesture Recognition Systems. IEEE Access 10, pp. 1020-1035. DOI: 10.1109/ACCESS.2021.3136153. Online publication date: 2022.
  • (2022) Learnability evaluation of the markup language for designing applications controlled by gaze. International Journal of Human-Computer Studies 165(C). DOI: 10.1016/j.ijhcs.2022.102863. Online publication date: 1-Sep-2022.
  • (2021) Multirobot Confidence and Behavior Modeling: An Evaluation of Semiautonomous Task Performance and Efficiency. Robotics 10(2), 71. DOI: 10.3390/robotics10020071. Online publication date: 17-May-2021.
  • (2021) Influence of keyboard layout and feedback type in eye-typing tasks: a comparative study. Proceedings of the 32nd European Conference on Cognitive Ergonomics, pp. 1-7. DOI: 10.1145/3452853.3452887. Online publication date: 26-Apr-2021.
  • (2020) Gaze-Adaptive Lenses for Feature-Rich Information Spaces. ACM Symposium on Eye Tracking Research and Applications, pp. 1-8. DOI: 10.1145/3379155.3391323. Online publication date: 2-Jun-2020.
  • (2019) Understanding pointing for workspace tasks on large high-resolution displays. Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, pp. 1-9. DOI: 10.1145/3365610.3365636. Online publication date: 26-Nov-2019.
