DOI: 10.1145/3025453.3025794
CHI Conference Proceedings · Research article

ScreenGlint: Practical, In-situ Gaze Estimation on Smartphones

Published: 02 May 2017

Abstract

Gaze estimation has widespread applications. However, little work has explored gaze estimation on smartphones, even though they are fast becoming ubiquitous. This paper presents ScreenGlint, a novel approach that exploits the glint (reflection) of the screen on the user's cornea for gaze estimation, using only the image captured by the front-facing camera. We first conduct a user study on common postures during smartphone use. We then design an experiment to evaluate the accuracy of ScreenGlint under varying face-to-screen distances. An in-depth evaluation involving multiple users is conducted, and the impact of head pose variations is investigated. ScreenGlint achieves an overall angular error of 2.44° without head pose variations, and 2.94° with head pose variations. Our technique compares favorably with state-of-the-art methods, indicating that the glint of the screen is an effective and practical cue for gaze estimation on the smartphone platform. We believe that this work can open up new possibilities for practical and ubiquitous gaze-aware applications.
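The abstract does not spell out the pipeline, but the general family of glint-based methods it belongs to works roughly as follows: detect the screen's corneal reflection (glint) and a reference point such as the pupil center in the front-camera image, then map the glint-pupil offset vector to screen coordinates using a regression fitted during a short calibration. The sketch below illustrates that idea with a second-order polynomial calibration, plus the conversion of on-screen error to the visual-angle metric reported above. All function names and the polynomial form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_gaze_mapping(glint_pupil_vectors, screen_points):
    """Fit a 2nd-order polynomial mapping from glint-pupil vectors
    (vx, vy) to on-screen gaze coordinates via least squares.
    (Illustrative calibration model, not the paper's exact method.)"""
    V = np.asarray(glint_pupil_vectors, dtype=float)
    S = np.asarray(screen_points, dtype=float)
    vx, vy = V[:, 0], V[:, 1]
    # Design matrix of polynomial terms: 1, vx, vy, vx*vy, vx^2, vy^2
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
    coeffs, *_ = np.linalg.lstsq(A, S, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def predict_gaze(coeffs, v):
    """Map a single glint-pupil vector to a predicted screen point."""
    vx, vy = v
    terms = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return terms @ coeffs

def angular_error(pred, truth, distance_mm):
    """Convert on-screen error (mm) to visual angle in degrees,
    given the face-to-screen distance (mm)."""
    err_mm = np.linalg.norm(np.asarray(pred) - np.asarray(truth))
    return np.degrees(np.arctan2(err_mm, distance_mm))
```

For example, with a 9-point calibration grid, a roughly 5 mm on-screen error at a typical 300 mm face-to-screen distance corresponds to about 1° of visual angle, which is the scale on which the 2.44° and 2.94° results above should be read.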

Supplementary Material

suppl.mov (pn2672-file3.mp4)
Supplemental video
suppl.mov (pn2672p.mp4)
Supplemental video


Cited By

  • (2024) Gaze-Swin: Enhancing Gaze Estimation with a Hybrid CNN-Transformer Network and Dropkey Mechanism. Electronics 13:2, 328. DOI: 10.3390/electronics13020328. Online publication date: 12-Jan-2024
  • (2024) CalibRead: Unobtrusive Eye Tracking Calibration from Natural Reading Behavior. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8:4, 1-30. DOI: 10.1145/3699737. Online publication date: 21-Nov-2024
  • (2024) Enhancing Readability with a Target-Aware Zooming Technique for Touch Surfaces. Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-3. DOI: 10.1145/3672539.3686329. Online publication date: 13-Oct-2024

      Published In

      CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
      May 2017
      7138 pages
      ISBN:9781450346559
      DOI:10.1145/3025453

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. gaze estimation
      2. glint
      3. mobile eye tracker
      4. screen reflection

      Qualifiers

      • Research-article

      Funding Sources

      • Hong Kong Research Grants Council

      Conference

      CHI '17

      Acceptance Rates

CHI '17 Paper Acceptance Rate: 600 of 2,400 submissions, 25%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


      Article Metrics

• Downloads (last 12 months): 113
• Downloads (last 6 weeks): 16
      Reflects downloads up to 13 Feb 2025

Cited By

• (2024) PrivatEyes: Appearance-based Gaze Estimation Using Federated Secure Multi-Party Computation. Proceedings of the ACM on Human-Computer Interaction 8:ETRA, 1-23. DOI: 10.1145/3655606. Online publication date: 28-May-2024
• (2024) ClearDepth: Addressing Depth Distortions Caused By Eyelashes For Accurate Geometric Gaze Estimation On Mobile Devices. 2024 IEEE International Conference on Image Processing (ICIP), 2135-2141. DOI: 10.1109/ICIP51287.2024.10647998. Online publication date: 27-Oct-2024
• (2023) Person-Specific Gaze Estimation from Low-Quality Webcam Images. Sensors 23:8, 4138. DOI: 10.3390/s23084138. Online publication date: 20-Apr-2023
• (2023) BoT2L-Net: Appearance-Based Gaze Estimation Using Bottleneck Transformer Block and Two Identical Losses in Unconstrained Environments. Electronics 12:7, 1704. DOI: 10.3390/electronics12071704. Online publication date: 4-Apr-2023
• (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56:2, 1-38. DOI: 10.1145/3606947. Online publication date: 15-Sep-2023
• (2023) Gaze-based Interaction on Handheld Mobile Devices. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-4. DOI: 10.1145/3588015.3589540. Online publication date: 30-May-2023
• (2023) GlanceWriter: Writing Text by Glancing Over Letters with Gaze. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3544548.3581269. Online publication date: 19-Apr-2023
