
CamType: assistive text entry using gaze with an off-the-shelf webcam

  • Original Paper

Machine Vision and Applications

Abstract

As assistive technology advances, eye-based text entry systems have been developed to help physically challenged people improve their ability to communicate. However, text entry speed in early eye-typing systems tends to be relatively slow because of the dwell time required for each selection. Dwell-free methods have recently been proposed that outperform dwell-based systems in both speed and resilience, but they still depend on a dedicated eye-tracking device. In this article, we propose a prototype eye-typing system that uses an off-the-shelf webcam instead of a dedicated eye tracker: an appearance-based method estimates the user's gaze coordinates on the screen from frontal face images captured by the webcam. We also investigate several critical issues of the appearance-based method, which helps to improve estimation accuracy and reduce computational complexity in practice. Performance evaluation shows that eye typing with a webcam using the proposed method is comparable to that with an eye tracker under a small degree of head movement.
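The appearance-based estimation idea described above can be sketched, in a greatly reduced form, as a regularized linear regression from eye-appearance feature vectors to 2-D screen coordinates, learned from calibration samples. This is a minimal illustration of the general technique; the function names, the ridge formulation, and the synthetic calibration data below are assumptions for the sketch, not the paper's actual model.

```python
import numpy as np

def fit_gaze_mapper(appearances, screen_points, reg=1e-3):
    """Fit a ridge-regularized linear map from appearance vectors
    (n_samples x n_features) to screen coordinates (n_samples x 2)."""
    # Append a constant column so the map includes a bias term.
    X = np.hstack([appearances, np.ones((appearances.shape[0], 1))])
    # Closed-form ridge regression: W = (X^T X + reg I)^-1 X^T Y
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]),
                           X.T @ screen_points)

def predict_gaze(W, appearance):
    """Map one appearance vector to an (x, y) screen coordinate."""
    return np.append(appearance, 1.0) @ W

# Synthetic "calibration session": appearance features with a known
# linear relationship to gaze position, standing in for real eye images.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 8))          # 50 calibration samples, 8 features
true_W = rng.normal(size=(8, 2))
Y = A @ true_W                        # ground-truth screen coordinates

W = fit_gaze_mapper(A, Y)
pred = predict_gaze(W, A[0])
print("mean abs error:", np.abs(pred - Y[0]).mean())
```

In practice the feature vector would come from cropped, normalized eye-region images, and head movement would have to be compensated before regression, which is where much of the difficulty discussed in the article lies.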


[Figures 1–18 appear in the full article; captions are not available in this preview.]


Acknowledgements

This work is a collaboration with the Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY).

Corresponding author

Correspondence to Yi Liu.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Liu, Y., Lee, BS., Rajan, D. et al. CamType: assistive text entry using gaze with an off-the-shelf webcam. Machine Vision and Applications 30, 407–421 (2019). https://doi.org/10.1007/s00138-018-00997-4
