DOI: 10.1145/2669557.2669571

Experiences and challenges with evaluation methods in practice: a case study

Published: 10 November 2014

Abstract

Developing information visualizations for companies poses specific challenges, especially for evaluation. It is advisable to test such visualizations under realistic circumstances, but various constraints can make this quite difficult. In this paper, we discuss three methods that can be used to conduct evaluations in companies. These methods suit different stages of the software life cycle (design, development, deployment) and reflect an iterative approach to evaluation. Based on an overview of available evaluation methods, we argue that this combination of fairly lightweight methods is especially appropriate for evaluating information visualizations in companies. The methods complement each other and emphasize different aspects of the evaluation. From this case study, we generalize the lessons learned from our experiences of conducting evaluations in this context.


Cited By

  • Visual Interactive Creation, Customization, and Analysis of Data Quality Metrics. Journal of Data and Information Quality, 10(1):1--26, 2018. DOI: 10.1145/3190578
  • Evaluation Methods in Process-Aware Information Systems Research with a Perspective on Human Orientation. Business & Information Systems Engineering, 58(6):397--414, 2016. DOI: 10.1007/s12599-016-0427-3

    Published In

    BELIV '14: Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization
    November 2014
    184 pages
    ISBN:9781450332095
    DOI:10.1145/2669557
    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. focus group
    2. log file analysis
    3. paper prototype

    Qualifiers

    • Research-article

    Acceptance Rates

    BELIV '14 Paper Acceptance Rate 23 of 30 submissions, 77%;
    Overall Acceptance Rate 45 of 64 submissions, 70%
