
A Psychophysics-inspired Model of Gaze Selection Performance

Published: 02 June 2020

Abstract

Eye gaze promises to be a fast and intuitive way of interacting with technology. Importantly, the performance of a gaze selection paradigm depends on the eye tracker used: Higher tracking accuracy allows for selection of smaller targets, and higher precision and sampling rate allow for faster and more robust interaction. Here we present a novel approach to predict the minimal eye tracker specifications required for gaze-based selection. We quantified selection performance for targets of different sizes while recording high-fidelity gaze data. Selection performance across target sizes was well modeled by a sigmoid similar to a psychometric function. We then simulated lower tracker fidelity by adding noise, a constant spatial bias, or temporal sub-sampling of the recorded data while re-fitting the model each time. Our approach can inform design by predicting performance for a given interface element and tracker fidelity or the minimal element size for a specific performance level.
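The modeling approach described in the abstract can be sketched in code. The following is an illustrative reconstruction, not the authors' implementation: the endpoint scatter, noise magnitudes, bias value, and target sizes are all assumed for demonstration, and only the added-noise and constant-bias degradations are simulated (temporal sub-sampling is omitted for brevity). Selection success rates are generated from synthetic 1D gaze endpoints, a sigmoid akin to a psychometric function is fit to success rate versus target size, and the fit is repeated after degrading the simulated tracker.

```python
# Illustrative sketch of the paper's approach using synthetic data.
# Endpoint scatter, noise magnitudes, and target sizes are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(size, mu, sigma, lapse=0.02):
    """Sigmoid mapping target size (deg) to selection success rate."""
    return (1 - lapse) / (1 + np.exp(-(size - mu) / sigma))

rng = np.random.default_rng(0)
sizes = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])  # target diameters (deg)
n_trials = 500
endpoint_sd = 0.5  # assumed 1D gaze endpoint scatter (deg)

def success_rate(extra_noise=0.0, bias=0.0):
    """Fraction of simulated gaze endpoints landing inside each target."""
    rates = []
    for d in sizes:
        sd = np.hypot(endpoint_sd, extra_noise)  # degraded-tracker scatter
        err = rng.normal(bias, sd, n_trials)     # constant bias + noise
        rates.append(np.mean(np.abs(err) < d / 2))
    return np.array(rates)

# Fit the sigmoid to baseline data, then re-fit after simulating a
# lower-fidelity tracker (added noise or a constant spatial bias).
thresholds = {}
for label, kwargs in [("baseline", {}),
                      ("noisy", {"extra_noise": 0.5}),
                      ("biased", {"bias": 0.5})]:
    rates = success_rate(**kwargs)
    # p0 has two entries, so only mu and sigma are fit; lapse stays fixed.
    (mu, sigma), _ = curve_fit(psychometric, sizes, rates,
                               p0=[1.0, 0.5], bounds=([0, 0.01], [5, 5]))
    thresholds[label] = mu
    print(f"{label}: threshold target size ~ {mu:.2f} deg")
```

In this toy setup the fitted threshold (the target size yielding roughly half-maximal success) shifts to larger sizes under either degradation, mirroring the paper's idea of reading off the minimal element size for a given tracker fidelity from the re-fit model.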



Published In

ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020, 305 pages
ISBN: 9781450371346
DOI: 10.1145/3379156
Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. accuracy
  2. eye tracking
  3. gaze interaction
  4. precision
  5. sampling rate
  6. selection

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

ETRA '20

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Article Metrics

  • Downloads (last 12 months): 36
  • Downloads (last 6 weeks): 5

Reflects downloads up to 05 Mar 2025

Cited By

  • (2025) Gaze Inputs for Targeting: The Eyes Have It, Not With a Cursor. International Journal of Human–Computer Interaction (1–19). https://doi.org/10.1080/10447318.2025.2453966. Online publication date: 5-Feb-2025.
  • (2024) The Effect of Degraded Eye Tracking Accuracy on Interactions in VR. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (1–7). https://doi.org/10.1145/3649902.3656369. Online publication date: 4-Jun-2024.
  • (2023) GazeRayCursor: Facilitating Virtual Reality Target Selection by Blending Gaze and Controller Raycasting. Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology (1–11). https://doi.org/10.1145/3611659.3615693. Online publication date: 9-Oct-2023.
  • (2023) Leveling the Playing Field: A Comparative Reevaluation of Unmodified Eye Tracking as an Input and Interaction Modality for VR. IEEE Transactions on Visualization and Computer Graphics 29:5 (2269–2279). https://doi.org/10.1109/TVCG.2023.3247058. Online publication date: 1-May-2023.
  • (2022) vexptoolbox: A software toolbox for human behavior studies using the Vizard virtual reality platform. Behavior Research Methods 55:2 (570–582). https://doi.org/10.3758/s13428-022-01831-6. Online publication date: 23-Mar-2022.
  • (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology (1–12). https://doi.org/10.1145/3562939.3565619. Online publication date: 29-Nov-2022.
  • (2022) My Eyes Hurt: Effects of Jitter in 3D Gaze Tracking. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (310–315). https://doi.org/10.1109/VRW55335.2022.00070. Online publication date: Mar-2022.
  • (2021) Relationship between Dwell-Time and Model Human Processor for Dwell-based Image Selection. ACM Symposium on Applied Perception 2021 (1–5). https://doi.org/10.1145/3474451.3476240. Online publication date: 16-Sep-2021.
