ABSTRACT
Eye tracking is a compelling tool for revealing people's spatial-temporal distribution of visual attention. However, quality eye tracking hardware is expensive and can only be used with one person at a time, and webcam-based eye tracking systems impose tight constraints on head movement and lighting conditions, leading to substantial data loss and inaccuracy. To address these drawbacks, we introduce a new approach that harnesses the crowd to understand the allocation of visual attention. In our approach, crowdsourcing participants use mouse clicks to self-report the positions and trajectories for the following valuable eye tracking measures: first gaze, last gaze, and all gazes. We validate our crowdsourcing approach with a user study, which demonstrated good accuracy compared to a real eye tracker. We then deployed our prototype, GazeCrowd, in a crowdsourcing setting and showed that it accurately generated gaze heatmaps and trajectory maps. Such an approach allows designers to evaluate and refine their visual designs without requiring limited and expensive eye trackers.
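The heatmaps described above can be produced from the crowd's self-reported click positions alone. As a minimal sketch (not the authors' implementation), each reported click can be "splatted" onto a pixel grid as a truncated Gaussian and the contributions summed across workers; the function name, grid size, and `sigma` parameter below are illustrative assumptions:

```python
import math

def gaze_heatmap(clicks, width, height, sigma=2.0):
    """Aggregate crowd-reported gaze clicks into a smoothed attention map.

    clicks: list of (x, y) pixel coordinates self-reported via mouse clicks.
    Returns a height x width grid of accumulated attention values.
    Note: illustrative sketch only, not the GazeCrowd implementation.
    """
    heat = [[0.0] * width for _ in range(height)]
    radius = int(3 * sigma)  # truncate each Gaussian splat at 3 sigma
    for cx, cy in clicks:
        for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
            for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                heat[y][x] += math.exp(-d2 / (2.0 * sigma * sigma))
    return heat
```

Cells clicked by many workers accumulate the most weight, so the peak of the resulting map marks the region the crowd reported attending to most; rendering it with a color ramp yields the familiar gaze heatmap.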
Social Eye Tracking: Gaze Recall with Online Crowds