DOI: 10.1145/2669557.2669569
Research article

Towards analyzing eye tracking data for evaluating interactive visualization systems

Published: 10 November 2014

ABSTRACT

Eye tracking is a suitable evaluation method for determining which regions and objects of a stimulus a human viewer perceived. Analysts can use eye tracking as a complement to other evaluation methods to obtain a more holistic assessment of novel visualization techniques beyond time and error measures. To date, most stimuli in eye tracking studies have been static images or videos. Since interaction is an integral part of visualization, an evaluation should include interaction as well. In this paper, we present an extensive literature review of evaluation methods for interactive visualizations. Based on this review, we propose ideas for analyzing eye movement data recorded with interactive stimuli, which requires a critical look at the challenges such stimuli induce. The first step is to collect data using different study methods; in our case, eye tracking, interaction logs, and think-aloud protocols. These methods must be thoroughly synchronized with one another. Analyzing the collected data then requires new analysis techniques. We investigate how existing approaches can be adapted to the new data types and sketch ideas for what new approaches could look like.
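The synchronization step mentioned in the abstract can be illustrated with a minimal sketch: merging timestamped eye-tracking fixations and interaction-log events onto one shared timeline, then asking which interaction (and thus which stimulus state) preceded a given fixation. The record format, stream names, and clock-offset parameters below are hypothetical, not taken from the paper.

```python
from bisect import bisect_right

def align_streams(fixations, interactions,
                  fixation_offset_ms=0, interaction_offset_ms=0):
    """Merge two time-sorted streams of (timestamp_ms, payload) tuples
    onto one shared timeline, shifting each stream by its recording
    offset to compensate for the different clock origins of the eye
    tracker and the interaction logger."""
    merged = [(t + fixation_offset_ms, "gaze", p) for t, p in fixations]
    merged += [(t + interaction_offset_ms, "interaction", p)
               for t, p in interactions]
    merged.sort(key=lambda event: event[0])
    return merged

def last_interaction_before(merged, t_ms):
    """Return the most recent interaction event at or before t_ms,
    e.g. to determine which UI state a fixation was recorded in."""
    events = [e for e in merged if e[1] == "interaction"]
    times = [e[0] for e in events]
    i = bisect_right(times, t_ms)
    return events[i - 1] if i > 0 else None

# Toy data: two fixations on areas of interest, two logged interactions.
fixations = [(100, "AOI:legend"), (450, "AOI:chart")]
interactions = [(200, "zoom-in"), (400, "pan")]
timeline = align_streams(fixations, interactions)
```

In a real study the offsets would come from a common synchronization signal (for example, a marker event recorded by both devices), and the merged timeline could additionally carry think-aloud utterances as a third stream.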


Published in

BELIV '14: Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization
November 2014, 184 pages
ISBN: 9781450332095
DOI: 10.1145/2669557
Copyright © 2014 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance rates: BELIV '14 accepted 23 of 30 submissions (77%); overall acceptance rate 45 of 64 submissions (70%).
