ABSTRACT
Remote eye trackers are widely used for screen-based interaction. They are less intrusive than head-mounted eye trackers, but are generally quite sensitive to head movement. This leads to a requirement for frequent recalibration, especially in applications requiring accurate eye tracking. We propose an online calibration method that compensates for head movements when estimates of the gaze targets are available. For example, in dwell-time-based gaze typing it is reasonable to assume that, for correct selections, the user's gaze target during the dwell time was at the key center. We use this assumption to derive an eye-position-dependent linear transformation matrix for correcting the measured gaze. Our experiments show that the proposed method significantly reduces errors over a large range of head movements.
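The core idea — using key centers of correct selections as implicit calibration targets and fitting a linear correction to the measured gaze — can be illustrated with a minimal least-squares sketch. This is not the paper's implementation: the paper's transformation matrix depends on eye position, while the sketch below fits a single affine correction from (measured gaze, key center) pairs; all function names are hypothetical and only numpy is assumed.

```python
import numpy as np

def fit_gaze_correction(measured, targets):
    """Fit a 2x3 affine correction A such that A @ [g, 1] approximates
    the target (key center) t for each measured gaze sample g,
    via ordinary least squares over all collected selections."""
    measured = np.asarray(measured, dtype=float)  # shape (n, 2)
    targets = np.asarray(targets, dtype=float)    # shape (n, 2)
    # Augment each gaze sample with a constant 1 for the offset term.
    X = np.hstack([measured, np.ones((len(measured), 1))])  # (n, 3)
    # Solve X @ A_T ~ targets in the least-squares sense.
    A_T, *_ = np.linalg.lstsq(X, targets, rcond=None)       # (3, 2)
    return A_T.T                                            # (2, 3)

def correct_gaze(A, gaze):
    """Apply the fitted affine correction to one raw gaze sample."""
    g = np.append(np.asarray(gaze, dtype=float), 1.0)
    return A @ g
```

In an online setting, the pairs would accumulate as the user makes correct selections, and the correction would be refit (or updated recursively) as the head moves; extending the fit so the matrix varies with measured eye position is what the paper's method adds on top of this basic scheme.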
Index Terms
- Task-embedded online eye-tracker calibration for improving robustness to head motion