Clustering of Gaze During Dynamic Scene Viewing is Predicted by Motion


Abstract

Where does one attend when viewing dynamic scenes? Research into the factors influencing gaze location during static scene viewing has reported that low-level visual features contribute very little to gaze location, especially when opposed by high-level factors such as viewing task. However, the inclusion of transient features such as motion in dynamic scenes may give visual features a greater influence on gaze allocation and on the coordination of gaze across viewers. In the present study, we investigated the contribution of low- to mid-level visual features to gaze location during free viewing of a large dataset of videos varying in content and length. Signal detection analysis on visual features and Gaussian Mixture Models for clustering gaze were used to identify the contribution of visual features to gaze location. The results show that mid-level visual features, including corners and orientations, can distinguish actual gaze locations from a randomly sampled baseline. However, temporal features such as flicker, motion, and their respective contrasts were the most predictive of gaze location. Additionally, moments at which all viewers’ gaze clustered tightly in the same location could be predicted by motion. Motion and mid-level visual features may influence gaze allocation in dynamic scenes, but it is currently unclear whether this influence is involuntary or due to correlations with higher-order factors such as scene semantics.
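
To make the two analyses named in the abstract concrete, here is a minimal sketch of (a) a signal detection analysis comparing feature values sampled at gaze locations against a randomly sampled baseline, and (b) Gaussian Mixture Model clustering of gaze to measure how tightly viewers cluster. This is not the authors’ pipeline: the frame size, the synthetic gaze points, and the Gaussian “motion” map are illustrative stand-ins, and scikit-learn’s GaussianMixture and roc_auc_score substitute for whatever implementation the study used.

```python
# Minimal sketch, not the authors' code. Assumes NumPy and scikit-learn;
# all data below are synthetic stand-ins for real gaze and feature maps.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
H, W = 480, 640  # hypothetical frame size in pixels

# Synthetic gaze of 20 viewers for one frame, clustered near (400, 240),
# as if a moving object there had drawn everyone's eyes.
gaze = rng.normal(loc=(400, 240), scale=30, size=(20, 2))

# (a) Signal detection: does a feature map separate gaze locations from a
# random baseline? The "motion" map is faked as a Gaussian bump at the object.
ys, xs = np.mgrid[0:H, 0:W]
motion_map = np.exp(-((xs - 400) ** 2 + (ys - 240) ** 2) / (2 * 60 ** 2))

def sample(feature_map, points):
    """Read feature values at (x, y) pixel locations, clipped to the frame."""
    x = np.clip(points[:, 0].astype(int), 0, W - 1)
    y = np.clip(points[:, 1].astype(int), 0, H - 1)
    return feature_map[y, x]

baseline = rng.uniform([0, 0], [W, H], size=(20, 2))  # random control points
scores = np.concatenate([sample(motion_map, gaze), sample(motion_map, baseline)])
labels = np.concatenate([np.ones(len(gaze)), np.zeros(len(baseline))])
print(f"ROC AUC, motion at gaze vs. baseline: {roc_auc_score(labels, scores):.2f}")

# (b) GMM clustering of gaze: when one component carries most of the weight
# with a small covariance, viewers' gaze is tightly clustered in one place.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(gaze)
k = int(np.argmax(gmm.weights_))
spread = np.sqrt(np.trace(gmm.covariances_[k]))  # rough cluster radius, pixels
print(f"Dominant cluster: weight {gmm.weights_[k]:.2f}, spread ~{spread:.0f} px")
```

Under this framing, a frame in which the dominant mixture component has high weight and small spread corresponds to the attentional synchrony the study reports, and an AUC well above 0.5 means the feature discriminates gaze locations from the random baseline better than chance.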




Acknowledgments

The authors would like to thank members of the Edinburgh Visual Cognition laboratory, including Antje Nuthmann and George Malcolm, for their comments and feedback, and Josie Landback for running the experiments. This project was funded by the Leverhulme Trust (Ref F/00-158/BZ) and the ESRC (RES-062-23-1092), awarded to John M. Henderson.

Author information

Corresponding author

Correspondence to Parag K. Mital.

Additional information

All eye-movement data and visualization tools can be obtained from: http://thediemproject.wordpress.com.

Appendices

Appendix 1

See Table 3.

Table 3: List of videos used in this study.

Appendix 2

Continuation of Fig. 6.

Cite this article

Mital, P.K., Smith, T.J., Hill, R.L. et al. Clustering of Gaze During Dynamic Scene Viewing is Predicted by Motion. Cogn Comput 3, 5–24 (2011). https://doi.org/10.1007/s12559-010-9074-z
