Research Article
DOI: 10.1145/2578153.2578167

Towards fine-grained fixation analysis: distilling out context dependence

Published: 26 March 2014

Abstract

In this paper, we explore the problem of analyzing gaze patterns with the aim of attributing greater meaning to observed fixations. In recent years, a number of efforts have attempted to categorize fixations according to their properties. Given that a multitude of factors may contribute to fixational behavior, including both bottom-up and top-down influences on the neural mechanisms for visual representation and saccadic control, a better understanding of the factors behind any given fixation may play an important role in augmenting raw fixation data. A grand objective of this line of thinking is to explain any observed fixation as a combination of various latent factors. In the current work, we do not seek to solve this problem in general, but rather to factor out the role of the holistic structure of a scene as one observable and quantifiable factor in determining fixational behavior. Statistical methods and approximations to achieve this are presented and supported by experimental results demonstrating their efficacy.
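
To make the notion of "distilling out" context dependence more concrete, the sketch below illustrates one plausible way such a factorization could be set up; it is not the method described in the paper. It assumes that a global, gist-like descriptor is available for each image and that observed fixations have been aggregated into per-image density maps. A simple ridge regression from the global descriptor to the fixation map then yields a context-explained component, and the residual is the part of fixational behavior that the holistic scene structure alone does not account for. The function name and toy data below are hypothetical.

```python
# Illustrative sketch only, not the paper's method: remove a context-driven
# component from fixation data by regressing per-image fixation density maps
# on a global, gist-like scene descriptor and keeping the residuals.
import numpy as np

def residual_fixation_maps(scene_features, fixation_maps, reg=1e-3):
    """scene_features: (n_images, d) global descriptor per image (hypothetical).
    fixation_maps:  (n_images, h*w) flattened fixation density maps.
    Returns residual maps after removing the component a linear (ridge)
    predictor can explain from the scene descriptor alone."""
    n = scene_features.shape[0]
    X = np.hstack([scene_features, np.ones((n, 1))])   # append a bias column
    gram = X.T @ X + reg * np.eye(X.shape[1])          # ridge-regularized Gram matrix
    W = np.linalg.solve(gram, X.T @ fixation_maps)     # closed-form ridge weights
    context_component = X @ W                          # part explained by scene context
    return fixation_maps - context_component           # "distilled" residual

# Toy usage with random stand-ins for real descriptors and fixation maps.
rng = np.random.default_rng(0)
feats = rng.normal(size=(50, 32))        # hypothetical gist-like features
maps = rng.random(size=(50, 20 * 30))    # hypothetical 20x30 fixation maps
residuals = residual_fixation_maps(feats, maps)
print(residuals.shape)                   # (50, 600)
```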


Cited By

  • (2016) A Deeper Look at Saliency: Feature Contrast, Semantics, and Beyond. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 516-524. https://doi.org/10.1109/CVPR.2016.62. Online publication date: Jun 2016.
  • (2016) Predicting task from eye movements. Neurocomputing 207:C, 653-668. https://doi.org/10.1016/j.neucom.2016.05.047. Online publication date: 26 Sep 2016.



    Published In

    ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications
    March 2014
    394 pages
    ISBN:9781450327510
    DOI:10.1145/2578153

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 26 March 2014


    Author Tags

    1. context
    2. eye tracking
    3. gaze analytics
    4. saccades

    Qualifiers

    • Research-article

    Conference

    ETRA '14: Eye Tracking Research and Applications
    March 26 - 28, 2014
    Safety Harbor, Florida

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%
