ABSTRACT
Distinct cognitive processing stages in mental spatial transformation tasks can be identified in oculomotor behavior. We recorded eye movements while participants performed a mental folding task, and analyzed gaze behavior for insights into the relationships among task difficulty, gaze proportion on each stimulus, gaze switches between stimuli, and reaction times. We found that switch frequency and the gaze proportion on the reference object decreased monotonically with increasing difficulty, and that both measures were related to the time taken to perform the mental transformation. We propose that the observed patterns of eye movements are indicative of distinct cognitive stages during mental folding. Further exploratory analyses are also discussed.