Abstract
In recent years, empirical studies have increasingly been seen as a core part of visualization research, and user evaluations have proliferated. It is broadly understood that new techniques and applications must be formally validated to count as meaningful contributions. However, these efforts continue to face the numerous challenges involved in validating complex software techniques that are used in a wide variety of contexts. The authors, who represent perspectives from across visualization research and applications, discuss the leading challenges that must be addressed for empirical research to have the greatest possible impact on visualization in the years to come. These include challenges in developing research questions and hypotheses, designing effective experiments and qualitative methods, and executing studies in specialized domains. We focus on the challenges that remain unsolved and outline possible approaches to addressing them. This chapter provides an informal survey and proposes a road map toward a more cohesive and grounded use of empirical studies in visualization research.
About this chapter
Cite this chapter
Ziemkiewicz, C., Chen, M., Laidlaw, D.H., Preim, B., Weiskopf, D. (2020). Open Challenges in Empirical Visualization Research. In: Chen, M., Hauser, H., Rheingans, P., Scheuermann, G. (eds) Foundations of Data Visualization. Springer, Cham. https://doi.org/10.1007/978-3-030-34444-3_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-34443-6
Online ISBN: 978-3-030-34444-3
eBook Packages: Computer Science (R0)