Open Challenges in Empirical Visualization Research

Chapter in: Foundations of Data Visualization

Abstract

In recent years, empirical studies have increasingly been seen as a core part of visualization research, and user evaluations have proliferated. It is broadly understood that new techniques and applications must be formally validated in order to be seen as meaningful contributions. However, these efforts continue to face the numerous challenges involved in validating complex software techniques that exist in a wide variety of use contexts. The authors, who represent perspectives from across visualization research and applications, discuss the leading challenges that must be addressed for empirical research to have the greatest possible impact on visualization in the years to come. These include challenges in developing research questions and hypotheses, designing effective experiments and qualitative methods, and executing studies in specialized domains. We discuss those challenges that have not yet been solved and possible approaches to addressing them. This chapter provides an informal survey and proposes a road map for moving forward to a more cohesive and grounded use of empirical studies in visualization research.

Author information

Correspondence to Caroline Ziemkiewicz.

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Ziemkiewicz, C., Chen, M., Laidlaw, D.H., Preim, B., Weiskopf, D. (2020). Open Challenges in Empirical Visualization Research. In: Chen, M., Hauser, H., Rheingans, P., Scheuermann, G. (eds) Foundations of Data Visualization. Springer, Cham. https://doi.org/10.1007/978-3-030-34444-3_12

  • DOI: https://doi.org/10.1007/978-3-030-34444-3_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-34443-6

  • Online ISBN: 978-3-030-34444-3

  • eBook Packages: Computer Science (R0)
