ABSTRACT
Eye tracking can be a suitable evaluation method for determining which regions and objects of a stimulus a human viewer perceived. Analysts can use eye tracking as a complement to other evaluation methods for a more holistic assessment of novel visualization techniques, beyond time and error measures. To date, most stimuli in eye tracking studies have been either static images or videos. Since interaction is an integral part of visualization, an evaluation should include interaction as well. In this paper, we present an extensive literature review on evaluation methods for interactive visualizations. Based on this review, we propose ideas for analyzing eye movement data from interactive stimuli, which requires a critical look at the challenges such stimuli induce. The first step is to collect data using different study methods; in our case, eye tracking, interaction logs, and think-aloud protocols. In addition, these study methods must be thoroughly synchronized with one another. Analyzing the collected data requires new analysis techniques. We investigate existing approaches and how they can be adapted to the new data types, and sketch ideas for what new approaches could look like.
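The synchronization step described above can be sketched in code. The following is a minimal, hypothetical example (not the authors' implementation) that aligns interaction-log events with eye tracking samples on a shared timeline by nearest-timestamp matching; the data layout, function name, and the 100 ms tolerance are illustrative assumptions.

```python
from bisect import bisect_left

def align_events(gaze_samples, log_events, max_offset_ms=100):
    """Match each interaction-log event to the nearest gaze sample in time.

    gaze_samples: list of (timestamp_ms, x, y), sorted by timestamp.
    log_events:   list of (timestamp_ms, action) from the interaction log.
    Returns a list of (action, gaze_sample_or_None); events with no gaze
    sample within max_offset_ms stay unmatched (None).
    """
    times = [t for t, _, _ in gaze_samples]
    aligned = []
    for ts, action in log_events:
        i = bisect_left(times, ts)
        # Candidates: the sample just before and just after the event time.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - ts), default=None)
        if best is not None and abs(times[best] - ts) <= max_offset_ms:
            aligned.append((action, gaze_samples[best]))
        else:
            aligned.append((action, None))
    return aligned

gaze = [(0, 100, 200), (50, 110, 205), (120, 300, 400)]
log = [(55, "click"), (500, "zoom")]
print(align_events(gaze, log))
# The click matches the sample at t=50; the zoom has no sample within 100 ms.
```

In practice the recording devices rarely share a clock, so a constant (or drifting) offset between the streams must be estimated first, e.g. from a common start marker; think-aloud audio can be aligned the same way via its recording timestamps.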