Research article · DOI: 10.1145/3606305.3606326

A Study on Eye Tracking for Mobile Devices Using Deep Learning

Published: 12 September 2023

ABSTRACT

Eye tracking technology has been available for several years, but it has traditionally been implemented on personal computers equipped with dedicated devices. Eye tracking on smartphones or tablets is considerably more challenging, because it must rely on standard built-in cameras and, typically, far fewer computational resources. In this paper, we present a study in which, using a large dataset of face images acquired through mobile devices, we investigate the influence of several design choices (in particular, the choice of optimizer, the use of color information, and regularization techniques) on a deep convolutional architecture. We believe that the results obtained, although preliminary, provide a useful contribution to the challenging and constantly evolving research field of eye tracking on mobile devices.
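
To make the design choices mentioned in the abstract more concrete, the sketch below shows a minimal, hypothetical Keras convolutional regressor for on-screen gaze coordinates. It is not the architecture evaluated in the paper: the function name build_gaze_cnn, the 64×64 input resolution, the layer sizes, and the use of Keras itself are assumptions made purely for illustration. The optimizer, the number of input color channels, and the dropout rate are exposed as parameters so they can be varied in the same spirit as the study's comparison of design choices.

```python
# Hypothetical sketch (not the paper's actual model): a small Keras CNN that
# regresses 2-D gaze coordinates from a face/eye crop, with the three design
# choices discussed in the abstract exposed as parameters.
from tensorflow import keras
from tensorflow.keras import layers


def build_gaze_cnn(optimizer="adam", grayscale=False, dropout_rate=0.5):
    """Build and compile a toy gaze-regression CNN.

    optimizer    -- any Keras optimizer name or instance (e.g. "adam", "sgd", "rmsprop")
    grayscale    -- if True, expect 1-channel input instead of RGB
    dropout_rate -- dropout probability used for regularization (0 disables it)
    """
    channels = 1 if grayscale else 3                 # color vs. grayscale input
    inputs = keras.Input(shape=(64, 64, channels))   # assumed input resolution

    x = layers.Conv2D(32, 3, activation="relu")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    x = layers.Dense(128, activation="relu")(x)
    if dropout_rate > 0:
        x = layers.Dropout(dropout_rate)(x)          # dropout regularization
    outputs = layers.Dense(2)(x)                     # (x, y) gaze point on the screen

    model = keras.Model(inputs, outputs)
    model.compile(optimizer=optimizer, loss="mse", metrics=["mae"])
    return model


# Example: compare optimizer choices on otherwise identical models.
for opt in ("sgd", "rmsprop", "adam"):
    model = build_gaze_cnn(optimizer=opt, grayscale=True, dropout_rate=0.25)
    print(opt, model.count_params())
```

Swapping the optimizer string, toggling grayscale input, or changing dropout_rate leaves the rest of the model untouched, which is the kind of controlled variation one would use to compare such design choices on a fixed dataset; the paper's actual dataset, architecture, and hyperparameters may differ substantially.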


Published in

CompSysTech '23: Proceedings of the 24th International Conference on Computer Systems and Technologies
June 2023 · 201 pages
ISBN: 9798400700477
DOI: 10.1145/3606305

          Copyright © 2023 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
