
Screen-Light Decomposition Framework for Point-of-Gaze Estimation Using a Single Uncalibrated Camera and Multiple Light Sources

Published in: Journal of Mathematical Imaging and Vision

Abstract

The use of a single uncalibrated camera is desirable for eye tracking because it reduces the overall complexity and cost of the system. Quite often, at least one external light source is used to enhance image quality and to generate a corneal reflection that serves as a reference point for estimating the point-of-gaze (PoG). Although the use of more than one light source has been shown to improve accuracy and robustness to head motion, it is unlikely that all corneal reflections appear in the eye images during natural eye movements. In this paper, we introduce the Screen-Light Decomposition (SLD) framework as a generalized model for PoG estimation using a single uncalibrated camera and a variable number of light sources. SLD synthesizes existing uncalibrated video-based eye trackers and can be used as a modeling tool to compare and design eye trackers. We have used the framework to design a novel eye-tracking technique for single normalized space adaptive gaze estimation, called SAGE, whose performance degrades gracefully when one or more corneal reflections are not detected, even during the calibration procedure. Results from a user experiment demonstrate its improved performance over other designs.
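
To make the abstract's idea of adaptive normalization concrete, the sketch below illustrates the general homography-normalization approach that uncalibrated multi-light trackers build on: the detected corneal reflections (glints) define a mapping from the image into a normalized space, the pupil center is mapped through it, and when too few reflections are visible a lower-order transform is substituted. This is a minimal illustration of the underlying technique, not the authors' SAGE method; the canonical unit-square positions, the affine fallback, and all names are assumptions made for the example.

    # Hypothetical sketch of glint-based normalization with graceful
    # degradation. Not the paper's SAGE implementation; the canonical
    # light positions and the fallback strategy are illustrative choices.
    import numpy as np

    # Assumed canonical positions of the light sources in the normalized
    # space (corners of a unit square), keyed by light-source id.
    CANONICAL = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (1.0, 1.0), 3: (0.0, 1.0)}

    def fit_homography(src, dst):
        """Direct linear transform (DLT) for a 3x3 homography; needs >= 4 pairs."""
        rows = []
        for (x, y), (u, v) in zip(src, dst):
            rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        # The homography is the null vector of the constraint matrix.
        _, _, vt = np.linalg.svd(np.asarray(rows))
        return vt[-1].reshape(3, 3)

    def fit_affine(src, dst):
        """Least-squares affine map; usable with only 3 point pairs."""
        a = np.hstack([np.asarray(src), np.ones((len(src), 1))])
        coeffs, *_ = np.linalg.lstsq(a, np.asarray(dst), rcond=None)  # 3x2
        h = np.eye(3)
        h[:2, :] = coeffs.T
        return h

    def normalize_pupil(pupil_xy, glints):
        """Map the pupil center into the normalized space using whatever
        glints were detected. `glints` maps light-source id -> image (x, y)."""
        ids = sorted(glints)
        src = [glints[i] for i in ids]
        dst = [CANONICAL[i] for i in ids]
        if len(ids) >= 4:
            h = fit_homography(src, dst)   # full projective normalization
        elif len(ids) == 3:
            h = fit_affine(src, dst)       # graceful degradation: affine
        else:
            raise ValueError("this sketch needs at least 3 glints")
        p = h @ np.array([pupil_xy[0], pupil_xy[1], 1.0])
        return p[:2] / p[2]

    # Example: all four glints visible, then the same frame with one missing.
    glints = {0: (310.0, 240.0), 1: (360.0, 242.0),
              2: (358.0, 280.0), 3: (312.0, 278.0)}
    print(normalize_pupil((335.0, 260.0), glints))
    print(normalize_pupil((335.0, 260.0),
                          {k: v for k, v in glints.items() if k != 2}))

In a complete tracker, a per-user calibration (for example, a low-order polynomial fitted during a short calibration session) would then map the normalized pupil coordinates to screen coordinates. Because every estimate passes through the same normalized space, a missing reflection only lowers the order of the normalizing transform instead of invalidating the calibration, which is the graceful degradation the abstract describes.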



Acknowledgements

This research was supported by Grants 2011/00267-1 and 2016/10148-2 from the São Paulo Research Foundation (FAPESP).

Author information

Corresponding author

Correspondence to Carlos H. Morimoto.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Morimoto, C.H., Coutinho, F.L. & Hansen, D.W. Screen-Light Decomposition Framework for Point-of-Gaze Estimation Using a Single Uncalibrated Camera and Multiple Light Sources. J Math Imaging Vis 62, 585–605 (2020). https://doi.org/10.1007/s10851-020-00947-8

