
Minimal sequential gaze models for inferring walkers' tasks

Published: 06 September 2016

Abstract

Eye movements during extended sequential behavior are known to reflect task demands far more than low-level feature saliency. However, the more naturalistic the task, the harder it becomes to establish which cognitive processes it elicits moment by moment. Here we ask which sequential model is required to capture gaze sequences well enough that the ongoing task can be inferred reliably. Specifically, we consider eye movements of human subjects navigating a walkway in a virtual environment while avoiding obstacles and approaching targets. We show that Hidden Markov Models, which have been used extensively to model human sequential behavior, can be augmented with a few state variables describing the subject's egocentric position relative to objects in the environment, dramatically increasing classification accuracy for the ongoing task and generating gaze sequences that closely match those observed in human subjects.
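The abstract describes inferring the ongoing task from gaze sequences with per-task sequential models. As a minimal, hypothetical sketch of that classification scheme (not the paper's actual model, which additionally conditions on egocentric position), one can fit one discrete-emission HMM per task and assign a new gaze sequence to the task whose model gives it the highest likelihood. All parameters, task names, and fixation categories below are invented for illustration.

```python
import math

def _logsumexp(xs):
    """Numerically stable log(sum(exp(x))) over a list."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log_likelihood(obs, start, trans, emit):
    """Log P(obs | model) for a discrete-emission HMM via the forward algorithm."""
    n = len(start)
    alpha = [math.log(start[s]) + math.log(emit[s][obs[0]]) for s in range(n)]
    for o in obs[1:]:
        alpha = [
            math.log(emit[s][o])
            + _logsumexp([alpha[p] + math.log(trans[p][s]) for p in range(n)])
            for s in range(n)
        ]
    return _logsumexp(alpha)

def classify(obs, models):
    """Assign the gaze sequence to the task whose HMM gives it highest likelihood."""
    return max(models, key=lambda task: log_likelihood(obs, *models[task]))

# Hypothetical models: observations are fixation categories
# 0 = "obstacle", 1 = "target", 2 = "path ahead".
models = {
    "avoid_obstacles": (
        [0.6, 0.4],                           # initial state distribution
        [[0.8, 0.2], [0.3, 0.7]],             # state transition matrix
        [[0.7, 0.1, 0.2], [0.2, 0.1, 0.7]],   # emission probabilities
    ),
    "approach_targets": (
        [0.5, 0.5],
        [[0.7, 0.3], [0.3, 0.7]],
        [[0.1, 0.7, 0.2], [0.1, 0.2, 0.7]],
    ),
}

print(classify([0, 0, 2, 0, 2], models))  # → avoid_obstacles
```

In the paper's setting, the key addition is that the hidden-state space is extended with variables for the walker's position relative to obstacles and targets, which the sketch above omits.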



Published In

MobileHCI '16: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
September 2016
664 pages
ISBN:9781450344135
DOI:10.1145/2957265
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. hidden Markov models
  2. inferring human actions
  3. sequential gaze models

Qualifiers

  • Extended-abstract

Conference

MobileHCI '16

Acceptance Rates

Overall Acceptance Rate 202 of 906 submissions, 22%

