DOI: 10.1145/2669557.2669580

Toward visualization-specific heuristic evaluation

Published: 10 November 2014

ABSTRACT

This position paper describes heuristic evaluation as it relates to visualization and visual analytics. We review heuristic evaluation in general, then comment on previous process-based, performance-based, and framework-based efforts to adapt the method to visualization-specific needs. We postulate that the framework-based approach holds the most promise for future progress in the development of visualization-specific heuristics, and we propose a specific framework as a starting point. We then recommend a method for community involvement and input into the further development of the heuristic framework and of more detailed design and evaluation guidelines.
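For readers unfamiliar with the method, heuristic evaluation in Nielsen's discount-usability tradition follows a simple protocol: a handful of evaluators independently inspect an interface against a list of heuristics, record each violation, and rate its severity (commonly on a 0-4 scale); the findings are then pooled across evaluators. The Python sketch below is a minimal illustration of that bookkeeping only; the Finding record, the severity_by_heuristic helper, and the two visualization-flavored heuristics in the example are hypothetical, not taken from this paper or its proposed framework.

    from collections import defaultdict
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Finding:
        """One usability problem reported by one evaluator (illustrative schema)."""
        evaluator: str
        heuristic: str  # which heuristic the problem violates
        severity: int   # 0 (not a problem) .. 4 (usability catastrophe)
        note: str

    def severity_by_heuristic(findings):
        """Pool findings across evaluators: mean severity per violated heuristic."""
        pooled = defaultdict(list)
        for f in findings:
            pooled[f.heuristic].append(f.severity)
        return {h: mean(ratings) for h, ratings in pooled.items()}

    # Two evaluators inspecting a visualization against hypothetical
    # visualization-specific heuristics:
    findings = [
        Finding("E1", "color encoding preserves order", 3, "rainbow colormap"),
        Finding("E2", "color encoding preserves order", 4, "categories blend"),
        Finding("E1", "interaction gives feedback", 2, "no brushing highlight"),
    ]
    print(severity_by_heuristic(findings))
    # {'color encoding preserves order': 3.5, 'interaction gives feedback': 2}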


Published in

BELIV '14: Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization
November 2014, 184 pages
ISBN: 9781450332095
DOI: 10.1145/2669557

      Copyright © 2014 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article

      Acceptance Rates

BELIV '14 paper acceptance rate: 23 of 30 submissions, 77%. Overall acceptance rate: 45 of 64 submissions, 70%.
