DOI: 10.1145/3314111.3319845

Research article

Task-embedded online eye-tracker calibration for improving robustness to head motion

Published: 25 June 2019

ABSTRACT

Remote eye trackers are widely used for screen-based interaction. They are less intrusive than head-mounted eye trackers, but are generally quite sensitive to head movement, which leads to a need for frequent recalibration, especially in applications requiring accurate eye tracking. We propose an online calibration method that compensates for head movements when estimates of the gaze targets are available. For example, in dwell-time-based gaze typing it is reasonable to assume that, for correct selections, the user's gaze target during the dwell time was the key center. We use this assumption to derive an eye-position-dependent linear transformation matrix for correcting the measured gaze. Our experiments show that the proposed method significantly reduces errors over a large range of head movements.
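The correction step described above — fitting a linear transformation from (measured gaze, assumed target) pairs collected during correct dwell-time selections — can be sketched as a least-squares affine fit. This is a minimal illustration, not the paper's method: the names below are hypothetical, and the paper's transformation additionally depends on eye position, which this sketch omits.

```python
import numpy as np

def fit_affine_correction(measured, targets):
    """Fit a 2D affine correction mapping measured gaze points to
    assumed gaze targets (e.g. key centers during correct selections).

    measured, targets: (N, 2) arrays of screen coordinates, N >= 3.
    Returns a (3, 2) matrix A such that [x, y, 1] @ A ~ target.
    """
    measured = np.asarray(measured, dtype=float)
    targets = np.asarray(targets, dtype=float)
    # Homogeneous coordinates so the fit includes a translation term.
    X = np.hstack([measured, np.ones((len(measured), 1))])
    A, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return A

def correct_gaze(A, gaze_xy):
    """Apply the fitted correction to a single raw gaze sample."""
    x, y = gaze_xy
    return np.array([x, y, 1.0]) @ A
```

In an online setting, such a fit would be re-estimated (or incrementally updated) as new correct selections arrive, so the correction tracks gradual head movement.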


Published in

ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019, 623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111

      Copyright © 2019 ACM

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%

