DOI: 10.1145/3025453.3025455 · CHI Conference Proceedings · Research article

GazeEverywhere: Enabling Gaze-only User Interaction on an Unmodified Desktop PC in Everyday Scenarios

Published: 02 May 2017

Abstract

Eye tracking is becoming increasingly affordable, and gaze therefore has the potential to become a viable input modality for human-computer interaction. We present GazeEverywhere, a solution that can replace the mouse with gaze control by adding a transparent layer on top of the system GUI. It comprises three parts: i) the SPOCK interaction method, which is based on smooth pursuit eye movements and does not suffer from the Midas touch problem; ii) an online recalibration algorithm that continuously improves gaze-tracking accuracy using the SPOCK target projections as reference points; and iii) an optional hardware setup utilizing head-up display technology to project superimposed dynamic stimuli onto the PC screen where a software modification of the system is not feasible. In validation experiments, GazeEverywhere's throughput according to ISO 9241-9 improved over dwell-time-based interaction methods and nearly reached trackpad level. Online recalibration reduced interaction target ('button') size by about 25%. Finally, a case study showed that users were able to browse the internet and successfully run a Wikirace using gaze only, without any plug-ins or other modifications.
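The smooth-pursuit selection idea the abstract describes — distinguishing a deliberate pursuit of a moving on-screen stimulus from ordinary fixations, thereby avoiding the Midas touch problem — is commonly implemented by correlating the gaze trajectory with each target's trajectory. The following is a minimal illustrative sketch of that general technique, not the paper's actual SPOCK implementation; the function name, window size, and threshold are assumptions for demonstration only.

```python
import numpy as np

def pursuit_match(gaze, target, threshold=0.8):
    """Correlate a window of (x, y) gaze samples with a moving target's
    trajectory; high correlation on both axes suggests the user is
    smoothly pursuing that target. Threshold is illustrative."""
    gaze = np.asarray(gaze, dtype=float)
    target = np.asarray(target, dtype=float)
    corr_x = np.corrcoef(gaze[:, 0], target[:, 0])[0, 1]  # Pearson r, x-axis
    corr_y = np.corrcoef(gaze[:, 1], target[:, 1])[0, 1]  # Pearson r, y-axis
    return min(corr_x, corr_y) >= threshold

# Example: noisy gaze following a circularly moving stimulus
t = np.linspace(0, 2 * np.pi, 60)
target = np.column_stack([np.cos(t), np.sin(t)])
rng = np.random.default_rng(0)
gaze = target + rng.normal(scale=0.05, size=target.shape)
print(pursuit_match(gaze, target))
```

Because selection requires sustained correlation with a dynamic stimulus rather than mere dwell, looking around the screen does not trigger accidental clicks; the matched target positions can then double as reference points for the kind of online recalibration the abstract mentions.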




    Published In

    CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
    May 2017, 7138 pages
    ISBN: 9781450346559
    DOI: 10.1145/3025453


    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. eye tracking
    2. gaze-based interaction
    3. mouse replacement
    4. smooth pursuit


    Conference

    CHI '17

    Acceptance Rates

    CHI '17 paper acceptance rate: 600 of 2,400 submissions (25%)
    Overall acceptance rate: 6,199 of 26,314 submissions (24%)



    Article Metrics

    • Downloads (last 12 months): 59
    • Downloads (last 6 weeks): 6
    Reflects downloads up to 13 Feb 2025


    Cited By

    • (2024) Enhancing Fixation and Pursuit: Optimizing Field of View and Number of Targets for Selection Performance in Virtual Reality. International Journal of Human–Computer Interaction 41(2), 1221–1233. DOI: 10.1080/10447318.2024.2313888 (15 Feb 2024)
    • (2024) Eye Tracking Review: Importance, Tools, and Applications. Emerging Trends and Applications in Artificial Intelligence, 383–394. DOI: 10.1007/978-3-031-56728-5_32 (30 Apr 2024)
    • (2023) Pactolo Bar: An Approach to Mitigate the Midas Touch Problem in Non-Conventional Interaction. Sensors 23(4), 2110. DOI: 10.3390/s23042110 (13 Feb 2023)
    • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1–38. DOI: 10.1145/3606947 (15 Sep 2023)
    • (2022) Evaluating Eye Movement Event Detection: A Review of the State of the Art. Behavior Research Methods 55(4), 1653–1714. DOI: 10.3758/s13428-021-01763-7 (17 Jun 2022)
    • (2022) Dwell Selection with ML-based Intent Prediction Using Only Gaze Data. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(3), 1–21. DOI: 10.1145/3550301 (7 Sep 2022)
    • (2022) DEEP: 3D Gaze Pointing in Virtual Reality Leveraging Eyelid Movement. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1–14. DOI: 10.1145/3526113.3545673 (29 Oct 2022)
    • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832–842. DOI: 10.1109/VR51125.2022.00105 (Mar 2022)
    • (2021) Nosype: A Novel Nose-tip Tracking-based Text Entry System for Smartphone Users with Clinical Disabilities for Touch-based Typing. Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction, 1–16. DOI: 10.1145/3447526.3472054 (27 Sep 2021)
    • (2021) Interaction With Gaze, Gesture, and Speech in a Flexibly Configurable Augmented Reality System. IEEE Transactions on Human-Machine Systems 51(5), 524–534. DOI: 10.1109/THMS.2021.3097973 (Oct 2021)
