DOI: 10.1145/3314111.3319818

Time- and space-efficient eye tracker calibration

Published: 25 June 2019

Abstract

One of the obstacles to bringing eye tracking technology to everyday human-computer interaction is the time-consuming calibration procedure. In this paper we investigate a novel calibration method based on smooth pursuit eye movements. The method uses linear regression to calculate the calibration mapping. Its advantage is that users can perform the calibration within a few seconds and need only a small calibration area to cover a large tracking area. We first describe the theoretical background of establishing a calibration mapping and discuss the differences between the calibration methods used. We then present a user study comparing the new regression-based method with a classical nine-point calibration and with other pursuit-based calibrations. The results show that the proposed method is fully functional, quick, and enables accurate tracking of a large area. The method has the potential to be integrated into current eye tracking systems to make them more usable in various use cases.
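The paper itself ships no code, but the core idea stated in the abstract, fitting a linear regression that maps raw eye-tracker coordinates to screen coordinates from samples collected while the user follows a moving pursuit target, can be sketched as follows. This is a minimal illustration under stated assumptions: the affine model with an intercept term, the function names, and the use of NumPy's least-squares solver are illustrative choices, not the authors' implementation.

import numpy as np

def fit_pursuit_calibration(eye_xy, target_xy):
    # eye_xy:    (N, 2) raw gaze coordinates recorded while the user
    #            follows the moving pursuit target (illustrative input).
    # target_xy: (N, 2) on-screen target positions at the same timestamps.
    # Returns a (3, 2) matrix M so that [x, y, 1] @ M approximates the
    # corresponding screen position.
    eye_xy = np.asarray(eye_xy, dtype=float)
    target_xy = np.asarray(target_xy, dtype=float)
    # Design matrix with a constant column for the offset term.
    A = np.hstack([eye_xy, np.ones((len(eye_xy), 1))])
    # Ordinary least squares; one solve covers both screen axes.
    M, _, _, _ = np.linalg.lstsq(A, target_xy, rcond=None)
    return M

def map_gaze(M, eye_xy):
    # Apply the fitted mapping to new raw gaze samples.
    eye_xy = np.atleast_2d(np.asarray(eye_xy, dtype=float))
    A = np.hstack([eye_xy, np.ones((len(eye_xy), 1))])
    return A @ M

Because the fitted mapping is linear, gaze estimates outside the region covered by the pursuit stimulus are obtained by extrapolation, which is consistent with the abstract's claim that a small calibration area can serve a much larger tracking area.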




Published In

ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019
623 pages
ISBN:9781450367097
DOI:10.1145/3314111

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. eye tracker calibration
  2. linear regression
  3. smooth pursuits

Qualifiers

  • Research-article

Conference

ETRA '19

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%
