
Improving the robustness of gaze tracking under unconstrained illumination conditions

Published in: Multimedia Tools and Applications

Abstract

In human-computer interaction (HCI) applications, the performance degradation of gaze trackers in real-world environments is a critical issue. Typically, gaze trackers utilize the pupil center and corneal reflection (CR) obtained from an infrared (IR) light source to estimate the point of regard (POR). However, false CRs are often generated by extraneous light sources such as sunlight or lamps. In this study, we propose a method for improving the robustness of gaze tracking under unconstrained illumination conditions. First, the proposed method generates a coded CR pattern by utilizing time-multiplexed IR light sources. Next, CR candidates are detected in eye images, and their coordinates are compensated based on the head and eye movements of the user. Finally, true CRs are selected from the motion-compensated CR candidates by utilizing a novel cost function. Experimental results indicate that the gaze-tracking performance of the proposed method under various lighting conditions is considerably better than that of conventional methods.
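The final step described above, selecting true CRs from motion-compensated candidates via a cost function, can be sketched as follows. This is an illustrative reconstruction, not the authors' exact formulation: the two cost terms (agreement with the time-multiplexed IR on/off code, and pairwise spacing consistent with the CR-pattern geometry) and all names (`cost`, `select_true_crs`, `expected_on`, `expected_spacing`) are assumptions introduced here for clarity.

```python
# Hypothetical sketch: choose true corneal reflections (CRs) from
# motion-compensated candidates by minimizing a cost over candidate subsets.
from itertools import combinations
import math

def cost(candidates, expected_on, expected_spacing):
    """Lower is better: penalize mismatch with the IR code and the geometry."""
    # Temporal term: a true CR should appear only in the frames where its
    # time-multiplexed IR source was switched on.
    code_term = sum(0.0 if c["on_frames"] == expected_on else 1.0
                    for c in candidates)
    # Spatial term: pairwise distances between true CRs should match the
    # spacing expected from the light-source arrangement.
    spatial_term = 0.0
    for a, b in combinations(candidates, 2):
        d = math.dist(a["xy"], b["xy"])
        spatial_term += abs(d - expected_spacing) / expected_spacing
    return code_term + spatial_term

def select_true_crs(all_candidates, k, expected_on, expected_spacing):
    """Pick the k-subset of CR candidates with the minimal cost."""
    best = min(combinations(all_candidates, k),
               key=lambda s: cost(list(s), expected_on, expected_spacing))
    return list(best)
```

For example, two candidates that share the expected on/off code and lie at the expected spacing would be preferred over a bright reflection from a lamp, which is visible in every frame regardless of the IR code.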




Acknowledgements

This work was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2014-3-00077, Development of global multi-target tracking and event prediction techniques based on real-time large-scale video analysis).

Author information

Corresponding author: Sung-Jea Ko.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Uhm, KH., Kang, MC., Kim, JY. et al. Improving the robustness of gaze tracking under unconstrained illumination conditions. Multimed Tools Appl 79, 20603–20616 (2020). https://doi.org/10.1007/s11042-020-08679-y

