ABSTRACT
Although smartphones are widely used in everyday life, studies of viewing behavior mainly employ desktop computers. This study examines whether closely spaced target locations on a smartphone screen can be decoded from gaze. Subjects wore a head-mounted eye tracker and fixated a target that appeared successively at 30 positions spaced 10.0 × 9.0 mm apart. Two conditions were tested: "hand-held" (phone in the subject's hand) and "mounted" (phone on a surface). Linear mixed models were fitted to examine whether gaze differed between targets, and t-tests on root-mean-squared errors evaluated the deviation between gaze and target positions. To decode target positions from gaze data, we trained a classifier and assessed its performance for every subject and condition. While gaze positions differed between targets (main effect of "target"), gaze deviated from the true target positions. The classifier's accuracy for the 30 locations varied considerably across subjects ("mounted": 30 to 93%; "hand-held": 8 to 100%).
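The decoding pipeline described above can be illustrated with a minimal sketch on synthetic data. The 5 × 6 grid layout, the noise level, and the nearest-centroid decoder are all assumptions for illustration; the study itself used a trained classifier (and its exact grid arrangement is not given in the abstract).

```python
import math
import random

random.seed(0)

# Hypothetical 5 x 6 grid of 30 target positions, spaced 10.0 x 9.0 mm
# (grid shape is an assumption; the abstract only states 30 positions).
targets = [(col * 10.0, row * 9.0) for row in range(5) for col in range(6)]

# Simulate gaze samples: true target position plus Gaussian noise
# (sigma = 2.0 mm is an assumed, not measured, value).
n_rep = 20
samples = []  # (target_label, gaze_x, gaze_y)
for label, (tx, ty) in enumerate(targets):
    for _ in range(n_rep):
        samples.append((label, tx + random.gauss(0, 2.0),
                               ty + random.gauss(0, 2.0)))

# Root-mean-squared error between gaze and target positions,
# analogous to the deviation measure reported in the abstract.
sq_err = [(gx - targets[l][0]) ** 2 + (gy - targets[l][1]) ** 2
          for l, gx, gy in samples]
rmse = math.sqrt(sum(sq_err) / len(sq_err))

# Decode target identity with a nearest-centroid rule -- a simple
# stand-in for the classifier trained in the study.
centroids = []
for k in range(len(targets)):
    pts = [(gx, gy) for l, gx, gy in samples if l == k]
    centroids.append((sum(p[0] for p in pts) / len(pts),
                      sum(p[1] for p in pts) / len(pts)))

def decode(gx, gy):
    return min(range(len(centroids)),
               key=lambda k: (gx - centroids[k][0]) ** 2
                           + (gy - centroids[k][1]) ** 2)

accuracy = sum(decode(gx, gy) == l for l, gx, gy in samples) / len(samples)
```

With the assumed 2 mm noise, most gaze samples fall within half the target spacing of their own centroid, so decoding accuracy is high; larger noise relative to the 10.0 × 9.0 mm spacing would push accuracy toward the lower per-subject values reported.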
Index Terms
- Inferring target locations from gaze data: a smartphone study