Eye Fixation Metrics for Large Scale Evaluation and Comparison of Information Visualizations

  • Conference paper

Part of the book series: Mathematics and Visualization (MATHVISUAL)

Abstract

An observer’s eye movements are often informative about how the observer interacts with and processes a visual stimulus. Here, we are specifically interested in what eye movements reveal about how the content of information visualizations is processed. Conversely, by pooling eye movements across many observers, what can we learn about the general effectiveness of different visualizations and the underlying design principles they employ? The contribution of this manuscript is to consider these questions at a large data scale, with thousands of eye fixations on hundreds of diverse information visualizations. We survey existing methods and metrics for collective eye movement analysis, and consider what each can tell us about the overall effectiveness of different information visualizations and designs at this large data scale.


Notes

  1. Dataset available at http://massvis.mit.edu.

  2. Saccades are the intervals between fixations: the motion of the eyes from one fixation point to the next. Analyzing saccades would require additional metrics [40, 51] and is beyond the scope of the present manuscript.

  3. The eye has to be recorded as “still” according to prespecified parameters [24, 60]. We use the standard thresholds set by the EyeLink Eyetracker [63].

  4. Typically, the sigma of the Gaussian is chosen to be equal to 1 or 2 degrees of visual angle, to model the uncertainty in viewing location.
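This fixation-map construction (an impulse per fixation, blurred by a Gaussian) can be sketched as follows. The function name, the pixel-space `sigma_px` parameter (the degrees-to-pixels conversion depends on the viewing setup), and the data layout are illustrative assumptions, not from the paper:

```python
import numpy as np

def fixation_map(fixations, shape, sigma_px):
    """Build a continuous fixation (heat) map: place a unit impulse at each
    (x, y) fixation, then blur with a Gaussian of the given sigma in pixels."""
    h, w = shape
    counts = np.zeros((h, w))
    for x, y in fixations:
        counts[int(y), int(x)] += 1
    # Separable Gaussian blur without a SciPy dependency: a 1-D kernel
    # truncated at 3 sigma, convolved along each axis in turn.
    r = int(np.ceil(3 * sigma_px))
    k = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma_px) ** 2)
    k /= k.sum()
    padded = np.pad(counts, r, mode="constant")
    rows = np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 1, padded)
    blurred = np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 0, rows)
    # Normalize to [0, 1] so maps from different images are comparable.
    return blurred / blurred.max() if blurred.max() > 0 else blurred
```

The resulting map peaks at fixated locations and can be thresholded or compared across observers and visualizations.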

  5. Goldberg and Helfman [17] discuss implementation choices and issues arising when working with AOIs and fixations.

  6. This has also been called inter-subject consistency [67], the inter-observer (IO) model [5], and inter-observer congruency (IOC) [42].

  7. Area under the receiver operating characteristic curve (AUROC or AUC) is the most commonly used similarity metric [14]. Note that IOC analysis can be extended to the ordering, instead of just the distribution, of fixations [18, 31, 42, 45].
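As a concrete illustration of AUC in this setting, the sketch below treats a map (e.g. one pooled from other observers' fixations) as a classifier separating a held-out observer's fixated pixels from all remaining pixels, using the Mann-Whitney formulation of AUC. The function name and data layout are illustrative assumptions:

```python
import numpy as np

def fixation_auc(saliency_map, fixations):
    """AUC of a map as a predictor of fixated pixels: the probability that a
    randomly chosen fixated pixel outscores a randomly chosen non-fixated one
    (ties count half), which equals the area under the ROC curve."""
    sal = np.asarray(saliency_map, dtype=float)
    fix_mask = np.zeros(sal.shape, dtype=bool)
    for x, y in fixations:
        fix_mask[int(y), int(x)] = True
    pos = sal[fix_mask]    # map values at fixated pixels
    neg = sal[~fix_mask]   # map values at all other pixels
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

A map that ranks every fixated pixel above every non-fixated one scores 1.0; a constant (uninformative) map scores 0.5, the chance level.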

  8. Labeled visualizations, eye movement data, and code for the visualizations in this manuscript are available at http://massvis.mit.edu.

  9. We suggest the following surveys: [4, 5, 16, 38, 65].

References

  1. Andrienko, G., Andrienko, N., Burch, M., Weiskopf, D.: Visual analytics methodology for eye movement studies. IEEE TVCG 18 (12), 2889–2898 (2012)

  2. Berg, A.C., Berg, T.L., Daume III, H., Dodge, J., Goyal, A., Han, X., Mensch, A., Mitchell, M., Sood, A., Stratos, K., et al.: Understanding and predicting importance in images. In: Computer Vision and Pattern Recognition, pp. 3562–3569. IEEE, Providence, RI (2012)

  3. Blascheck, T., Kurzhals, K., Raschke, M., Burch, M., Weiskopf, D., Ertl, T.: State-of-the-art of visualization for eye tracking data. In: Proceedings of EuroVis, vol. 2014, Swansea (2014)

  4. Borji, A., Sihite, D.N., Itti, L.: Quantitative analysis of human-model agreement in visual saliency modeling: a comparative study. IEEE Trans. Image Process. 22 (1), 55–69 (2013)

  5. Borji, A., Tavakoli, H.R., Sihite, D.N., Itti, L.: Analysis of scores, datasets, and models in visual saliency prediction. In: IEEE International Conference on Computer Vision, Sydney (2013)

  6. Borkin, M., Bylinskii, Z., Kim, N., Bainbridge, C.M., Yeh, C., Borkin, D., Pfister, H., Oliva, A.: Beyond memorability: visualization recognition and recall. IEEE TVCG 22 (1), 519–528 (2016)

  7. Borkin, M.A., Vo, A.A., Bylinskii, Z., Isola, P., Sunkavalli, S., Oliva, A., Pfister, H.: What makes a visualization memorable? IEEE TVCG 19 (12), 2306–2315 (2013)

  8. Burch, M., Konevtsova, N., Heinrich, J., Hoeferlin, M., Weiskopf, D.: Evaluation of traditional, orthogonal, and radial tree diagrams by an eye tracking study. IEEE TVCG 17 (12), 2440–2448 (2011)

  9. Bylinskii, Z., Judd, T., Borji, A., Itti, L., Durand, F., Oliva, A., Torralba, A.: MIT Saliency Benchmark. http://saliency.mit.edu/

  10. Byrne, M.D., Anderson, J.R., Douglass, S., Matessa, M.: Eye tracking the visual search of click-down menus. In: SIGCHI, pp. 402–409. ACM, New York (1999)

  11. Carpenter, P.A., Shah, P.: A model of the perceptual and conceptual processes in graph comprehension. J. Exp. Psychol. Appl. 4 (2), 75 (1998)

  12. Cowen, L., Ball, L.J., Delin, J.: An eye movement analysis of web page usability. In: People and Computers XVI, pp. 317–335. Springer, London (2002)

  13. Duchowski, A.T.: A breadth-first survey of eye-tracking applications. Behav. Res. Methods Instrum. Comput. 34 (4), 455–470 (2002)

  14. Fawcett, T.: An introduction to ROC analysis. Pattern Recognit. Lett. 27 (8), 861–874 (2006)

  15. Fitts, P.M., Jones, R.E., Milton, J.L.: Eye movements of aircraft pilots during instrument-landing approaches. Ergon. Psychol. Mech. Models Ergon 3, 56 (2005)

  16. Frintrop, S., Rome, E., Christensen, H.I.: Computational visual attention systems and their cognitive foundations: a survey. ACM Trans. Appl. Percept. (TAP) 7 (1), 6 (2010)

  17. Goldberg, J.H., Helfman, J.I.: Comparing information graphics: a critical look at eye tracking. In: BELIV’10, Atlanta, pp. 71–78. ACM (2010)

  18. Goldberg, J.H., Helfman, J.I.: Scanpath clustering and aggregation. In: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, Austin, pp. 227–234. ACM (2010)

  19. Goldberg, J.H., Kotval, X.P.: Computer interface evaluation using eye movements: methods and constructs. Int. J. Ind. Ergon. 24 (6), 631–645 (1999)

  20. Graf, W., Krueger, H.: Ergonomic evaluation of user-interfaces by means of eye-movement data. In: Proceedings of the Third International Conference on Human-Computer Interaction, Boston, pp. 659–665. Elsevier Science Inc. (1989)

  21. Grant, E.R., Spivey, M.J.: Eye movements and problem solving: guiding attention guides thought. Psychol. Sci. 14 (5), 462–466 (2003)

  22. Gygli, M., Grabner, H., Riemenschneider, H., Nater, F., Gool, L.: The interestingness of images. In: International Conference on Computer Vision, Sydney, pp. 1633–1640 (2013)

  23. Hayhoe, M.: Advances in relating eye movements and cognition. Infancy 6 (2), 267–274 (2004)

  24. Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., Van de Weijer, J.: Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press, Oxford/New York (2011)

  25. Huang, W.: Using eye tracking to investigate graph layout effects. In: APVIS’07, Sydney, pp. 97–100 (2007)

  26. Huang, W., Eades, P.: How people read graphs. In: APVIS’05, Sydney, vol. 45, pp. 51–58 (2005)

  27. Huang, W., Eades, P., Hong, S.-H.: A graph reading behavior: geodesic-path tendency. In: PacificVis’09, Kyoto, pp. 137–144 (2009)

  28. Isola, P., Xiao, J., Torralba, A., Oliva, A.: What makes an image memorable? In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Colorado Springs, pp. 145–152. IEEE (2011)

  29. Jacob, R., Karn, K.S.: Eye tracking in human-computer interaction and usability research: ready to deliver the promises. Mind 2 (3), 4 (2003)

  30. Jiang, M., Huang, S., Duan, J., Zhao, Q.: Salicon: saliency in context. In: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston (2015)

  31. Josephson, S., Holmes, M.E.: Visual attention to repeated internet images: testing the scanpath theory on the world wide web. In: Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, pp. 43–49. ACM (2002)

  32. Just, M.A., Carpenter, P.A.: Eye fixations and cognitive processes. Cogn. Psychol. 8 (4), 441–480 (1976)

  33. Karayev, S., Trentacoste, M., Han, H., Agarwala, A., Darrell, T., Hertzmann, A., Winnemoeller, H.: Recognizing image style. In: Proceedings of the British Machine Vision Conference (2014)

  34. Khosla, A., Raju, A.S., Torralba, A., Oliva, A.: Understanding and predicting image memorability at a large scale. In: Proceedings of the IEEE International Conference on Computer Vision, Santiago, pp. 2390–2398 (2015)

  35. Khosla, A., Xiao, J., Torralba, A., Oliva, A.: Memorability of image regions. In: NIPS, Lake Tahoe, pp. 305–313 (2012)

  36. Kim, N.W., Bylinskii, Z., Borkin, M.A., Oliva, A., Gajos, K.Z., Pfister, H.: A crowdsourced alternative to eye-tracking for visualization understanding. In: CHI’15 Extended Abstracts, Seoul, pp. 1349–1354. ACM (2015)

  37. Kim, S.-H., Dong, Z., Xian, H., Upatising, B., Yi, J.S.: Does an eye tracker tell the truth about visualizations? Findings while investigating visualizations for decision making. IEEE TVCG 18 (12), 2421–2430 (2012)

  38. Kimura, A., Yonetani, R., Hirayama, T.: Computational models of human visual attention and their implementations: a survey. IEICE Trans. Inf. Syst. 96-D, 562–578 (2013)

  39. Körner, C.: Eye movements reveal distinct search and reasoning processes in comprehension of complex graphs. Appl. Cogn. Psychol. 25 (6), 893–905 (2011)

  40. Kowler, E.: The role of visual and cognitive processes in the control of eye movement. Rev. Oculomot. Res. 4, 1–70 (1989)

  41. Lankford, C.: Gazetracker: software designed to facilitate eye movement analysis. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, pp. 51–55. ACM (2000)

  42. Le Meur, O., Baccino, T.: Methods for comparing scanpaths and saliency maps: strengths and weaknesses. Behav. Res. Methods 45 (1), 251–266 (2013)

  43. Le Meur, O., Baccino, T., Roumy, A.: Prediction of the inter-observer visual congruency (IOVC) and application to image ranking. In: Proceedings of the 19th ACM International Conference on Multimedia, pp. 373–382. ACM, New York (2011)

  44. Loftus, G.R., Mackworth, N.H.: Cognitive determinants of fixation location during picture viewing. J. Exp. Psychol. Hum. Percept. Perform. 4 (4), 565 (1978)

  45. Noton, D., Stark, L.: Scanpaths in saccadic eye movements while viewing and recognizing patterns. Vis. Res. 11 (9), 929 (1971)

  46. O’Donovan, P., Agarwala, A., Hertzmann, A.: Learning layouts for single-page graphic designs. IEEE TVCG 20 (8), 1200–1213 (2014)

  47. Pan, B., Hembrooke, H.A., Gay, G.K., Granka, L.A., Feusner, M.K., Newman, J.K.: The determinants of web page viewing behavior: an eye-tracking study. In: Proceedings of the 2004 Symposium on Eye Tracking Research & Applications, San Antonio, pp. 147–154. ACM (2004)

  48. Pelz, J.B., Canosa, R., Babcock, J.: Extended tasks elicit complex eye movement patterns. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, pp. 37–43. ACM (2000)

  49. Pohl, M., Schmitt, M., Diehl, S.: Comparing the readability of graph layouts using eyetracking and task-oriented analysis. In: Computational Aesthetics in Graphics, Visualization and Imaging, Lisbon, pp. 49–56 (2009)

  50. Pomplun, M., Ritter, H., Velichkovsky, B.: Disambiguating complex visual information: towards communication of personal views of a scene. Perception 25, 931–948 (1996)

  51. Poole, A., Ball, L.J.: Eye tracking in HCI and usability research. Encycl. Hum. Comput. Interact. 1, 211–219 (2006)

  52. Poole, A., Ball, L.J., Phillips, P.: In search of salience: a response-time and eye-movement analysis of bookmark recognition. In: People and Computers XVIII, pp. 363–378. Springer, London (2004)

  53. Raschke, M., Blascheck, T., Richter, M., Agapkin, T., Ertl, T.: Visual analysis of perceptual and cognitive processes. In: IEEE International Conference on Information Visualization Theory and Applications (IVAPP), pp. 284–291 (2014)

  54. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychol. Bull. 124 (3), 372 (1998)

  55. Rayner, K., Rotello, C.M., Stewart, A.J., Keir, J., Duffy, S.A.: Integrating text and pictorial information: eye movements when looking at print advertisements. J. Exp. Psychol. Appl. 7 (3), 219 (2001)

  56. Reinecke, K., Yeh, T., Miratrix, L., Mardiko, R., Zhao, Y., Liu, J., Gajos, K.Z.: Predicting users’ first impressions of website aesthetics with a quantification of perceived visual complexity and colorfulness. In: SIGCHI, San Jose, pp. 2049–2058. ACM (2013)

  57. Ristovski, G., Hunter, M., Olk, B., Linsen, L.: EyeC: coordinated views for interactive visual exploration of eye-tracking data. In: 17th International Conference on Information Visualisation, London, pp. 239–248 (2013)

  58. Rudoy, D., Goldman, D.B., Shechtman, E., Zelnik-Manor, L.: Crowdsourcing gaze data collection (2012). arXiv preprint arXiv:1204.3367

  59. Russell, B.C., Torralba, A., Murphy, K.P., Freeman, W.T.: LabelMe: a database and web-based tool for image annotation. Int. J. Comput. Vis. 77 (1–3), 157–173 (2008)

  60. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, pp. 71–78. ACM (2000)

  61. Shen, C., Zhao, Q.: Webpage saliency. In: European Conference on Computer Vision, Zurich, pp. 33–46. Springer (2014)

  62. Siirtola, H., Laivo, T., Heimonen, T., Raiha, K.-J.: Visual perception of parallel coordinate visualizations. In: International Conference on Information Visualisation, Barcelona, pp. 3–9 (2009)

  63. SR Research Ltd.: EyeLink Data Viewer User’s Manual, Version 1.8.402 (2008)

  64. Tsang, H.Y., Tory, M., Swindells, C.: eSeeTrack: visualizing sequential fixation patterns. IEEE TVCG 16 (6), 953–962 (2010)

  65. Tsotsos, J.K., Rothenstein, A.: Computational models of visual attention. Scholarpedia 6 (1), 6201 (2011)

  66. West, J.M., Haake, A.R., Rozanski, E.P., Karn, K.S.: eyePatterns: software for identifying patterns and similarities across fixation sequences. In: Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, San Diego, pp. 149–154. ACM (2006)

  67. Wilming, N., Betz, T., Kietzmann, T.C., König, P.: Measures and limits of models of fixation selection. PLoS ONE 6 (9), e24038 (2011)

  68. Wooding, D.S.: Eye movements of large populations: deriving regions of interest, coverage, and similarity using fixation maps. Behav. Res. Methods Instrum. Comput. 34 (4), 518–528 (2002)

  69. Wu, M.M.A., Munzner, T.: SEQIT: visualizing sequences of interest in eye tracking data. IEEE TVCG 22 (1), 449–458 (2015)

Acknowledgements

This work was partly funded by awards from Google and Xerox to A.O., NSERC Postgraduate Doctoral Scholarship (PGS-D) to Z.B., NSF Graduate Research Fellowship Program and NSERC Discovery grant to M.B., and a Kwanjeong Educational Foundation grant to N.K.

Author information

Correspondence to Zoya Bylinskii.

Copyright information

© 2017 Springer International Publishing AG

Cite this paper

Bylinskii, Z., Borkin, M.A., Kim, N.W., Pfister, H., Oliva, A. (2017). Eye Fixation Metrics for Large Scale Evaluation and Comparison of Information Visualizations. In: Burch, M., Chuang, L., Fisher, B., Schmidt, A., Weiskopf, D. (eds) Eye Tracking and Visualization. ETVIS 2015. Mathematics and Visualization. Springer, Cham. https://doi.org/10.1007/978-3-319-47024-5_14
