
Inferring target locations from gaze data: a smartphone study

Short paper · Published: 25 June 2019 · DOI: 10.1145/3314111.3319847

ABSTRACT

Although smartphones are widely used in everyday life, studies of viewing behavior mainly employ desktop computers. This study examines whether closely spaced target locations on a smartphone can be decoded from gaze. Subjects wore a head-mounted eye tracker and fixated a target that appeared successively at 30 positions spaced 10.0 × 9.0 mm apart. Two conditions were run: "hand-held" (phone in the subject's hand) and "mounted" (phone fixed on a surface). Linear mixed models were fitted to examine whether gaze positions differed between targets, and t-tests on root-mean-square errors evaluated the deviation between gaze and target positions. To decode target positions from gaze data, we trained a classifier and assessed its performance for every subject and condition. While gaze positions differed between targets (main effect of "target"), gaze deviated from the true target positions. The classifier's accuracy for the 30 locations varied considerably between subjects ("mounted": 30 to 93%; "hand-held": 8 to 100%).
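
A minimal sketch of this pipeline in R, for illustration only: the data frame "gaze", its column names, the 80/20 split, and the use of a support-vector machine (e1071) as the classifier are assumptions, since the abstract names neither the classifier nor the exact t-test contrast.

## Assumed layout of "gaze": one row per gaze sample, with columns
## subject (factor), condition ("mounted" / "hand-held"),
## target (factor, 30 levels), and gaze_x/gaze_y, target_x/target_y in mm.
library(lme4)   # linear mixed models
library(e1071)  # support-vector classifier

## 1) Do gaze positions differ between targets?
##    Fixed effect "target", random intercept per subject, one model per axis.
m_x <- lmer(gaze_x ~ target + (1 | subject), data = gaze)
m_y <- lmer(gaze_y ~ target + (1 | subject), data = gaze)
anova(m_x)  # F table for the main effect of "target" (horizontal axis)

## 2) How far does gaze deviate from the true target positions?
##    Root-mean-square error per subject and condition, then a t-test
##    (here comparing the two conditions; the abstract leaves the exact
##    contrast unspecified).
gaze$err <- with(gaze, sqrt((gaze_x - target_x)^2 + (gaze_y - target_y)^2))
rmse <- aggregate(err ~ subject + condition, data = gaze,
                  FUN = function(e) sqrt(mean(e^2)))
t.test(rmse$err[rmse$condition == "mounted"],
       rmse$err[rmse$condition == "hand-held"])

## 3) Decode the 30 target locations from gaze for one subject/condition.
d   <- subset(gaze, subject == "s01" & condition == "mounted")
idx <- sample(nrow(d), round(0.8 * nrow(d)))           # 80/20 train/test split
fit <- svm(target ~ gaze_x + gaze_y, data = d[idx, ])  # classification: target is a factor
mean(predict(fit, d[-idx, ]) == d$target[-idx])        # decoding accuracy on held-out samples

Repeating step 3 for every subject and condition yields per-subject accuracies of the kind reported in the abstract.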


Published in

ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019, 623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111

            Copyright © 2019 ACM


Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%
