ABSTRACT
The complex stochastic nature of eye-tracking data calls for sophisticated statistical models to ensure reliable inference in multi-trial eye-tracking experiments. We employ a Bayesian semi-parametric mixed-effects Markov model to compare gaze transition matrices across experimental factors while accommodating individual random effects. The model allows us not only to assess global influences of external factors on the gaze transition dynamics but also to understand these effects at a deeper, local level. We conducted an experiment to explore how recognizing distorted images of artwork and landmarks affects gaze transition patterns. Our dataset comprises sequences of the areas of interest visited, obtained by applying a content-independent grid to the resulting scan paths in a multi-trial setting. Results suggest that image recognition affects the transition dynamics to some extent, while image type plays an essential role in viewing behavior.
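To make the modeling target concrete, the sketch below shows how a single empirical gaze transition matrix can be estimated from AOI sequences. This is only an illustrative first-order Markov estimate, not the Bayesian mixed-effects model used in the paper; the function name, 0-based AOI indexing, and input layout (one sequence of grid-cell indices per trial) are assumptions for the example.

```python
import numpy as np

def transition_matrix(sequences, n_aois):
    """Empirical first-order Markov transition matrix from AOI sequences.

    sequences: list of per-trial lists of 0-based AOI indices.
    Returns an (n_aois, n_aois) matrix whose row i holds the observed
    probabilities of transitioning from AOI i to each AOI j.
    """
    counts = np.zeros((n_aois, n_aois))
    for seq in sequences:
        # Count consecutive AOI pairs within each trial only,
        # never across trial boundaries.
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Normalize each row; rows for AOIs never left remain all zeros.
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts),
                     where=row_sums > 0)

# Example: two trials over a 2-cell grid.
P = transition_matrix([[0, 1, 1, 0], [0, 0, 1]], n_aois=2)
```

In a multi-trial setting one such matrix can be tabulated per participant and condition; the hierarchical model then compares these matrices across experimental factors while sharing strength through the random effects.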