
xSDL: stroboscopic differential lighting eye tracker with extended temporal support

Original Paper, published in Machine Vision and Applications.

A Correction to this article was published on 22 May 2019


Abstract

Eye tracking (ET) for gaze interaction in wearable computing imposes harder constraints on computational efficiency and illumination conditions than remote ET. In this paper we present xSDL, an extended-temporal-support computer vision algorithm for accurate, robust, and efficient pupil detection and gaze estimation. The robustness and efficiency of xSDL come in part from the use of stroboscopic differential lighting (SDL), an extension of the differential lighting pupil detection technique developed in the 1990s. Because of the erratic behavior of eye movements, traditional computer vision tracking techniques (such as Kalman filters) do not perform well, so most ET techniques simply detect some eye feature (such as the pupil center) in every frame. Extended temporal support uses keyframes selected during eye fixations and a simple translation model of the pupil to further improve the computational performance of SDL. A prototype composed of two independent acquisition systems was developed to evaluate the performance of xSDL and four other state-of-the-art ET techniques under similar conditions. Our results show that xSDL outperforms those four algorithms in both speed (close to 2000 Hz using 240-line frames) and accuracy.
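The differential lighting idea at the core of SDL can be illustrated with a minimal sketch (hypothetical code, not the authors' implementation): under on-axis illumination the pupil retro-reflects and appears bright, while under off-axis illumination it appears dark, so subtracting the two synchronized frames isolates the pupil as the dominant blob. The function name and threshold below are assumptions chosen for illustration.

```python
import numpy as np

def detect_pupil(bright, dark, thresh=50):
    """Differential lighting pupil detection (illustrative sketch).

    bright: frame captured with on-axis (bright-pupil) illumination.
    dark:   frame captured with off-axis (dark-pupil) illumination.
    Returns an (x, y) pupil-center estimate, or None if no pupil is found.
    """
    # Signed subtraction so background noise cancels instead of wrapping around
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    mask = diff > thresh
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    # Centroid of the thresholded difference as a crude pupil-center estimate
    return float(xs.mean()), float(ys.mean())
```

A real tracker would refine this centroid, e.g. by fitting an ellipse to the blob boundary, but the subtraction step above is what makes the pupil trivial to segment under active illumination.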


[Figs. 1–18 appear in the full article.]


Change history

  • 22 May 2019

    Unfortunately, Fig. 9 was incorrectly published in the online version. The correct figure is updated here.


Acknowledgements

This work has been supported by the Fundação Araucária (DINTER Project UTFPR/IME-USP) and FAPESP Grant Number 2012/04426-0, 2016/0446-2, 2016/10148-3 and by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.

Author information


Corresponding author

Correspondence to Frank H. Borsato.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original version of this article was revised: Figure 9 was incorrectly published in the online version.


About this article


Cite this article

Borsato, F.H., Diaz-Tula, A. & Morimoto, C.H. xSDL: stroboscopic differential lighting eye tracker with extended temporal support. Machine Vision and Applications 30, 689–703 (2019). https://doi.org/10.1007/s00138-019-01022-y

