DOI: 10.1145/2110192.2110197

Learning-based evaluation of visual analytic systems

Published: 10 April 2010

Abstract

Evaluation in visualization remains a difficult problem because of the unique constraints and opportunities inherent to visualization use. While many potentially useful methodologies have been proposed, there remain significant gaps in assessing the value of the open-ended exploration and complex task-solving that the visualization community holds up as an ideal. In this paper, we propose a methodology to quantitatively evaluate a visual analytics (VA) system by measuring what its users learn, as demonstrated when they reapply that knowledge to a different problem or domain. This methodology is motivated by the observation that the ultimate goal of a user of a VA system is to gain knowledge of and expertise with the dataset, the task, or the tool itself. We propose a framework for describing and measuring knowledge gain in the analytical process based on these three types of knowledge and discuss considerations for evaluating each. Through carefully designed tests that examine how well participants can reapply knowledge learned from using a VA system, the utility of the visualization can be more directly assessed.
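To make the proposal concrete, here is a minimal scoring sketch, assuming a pre-test/post-test transfer design: a participant is tested before using the VA system and again on a different problem afterward, and a learning gain is computed separately for dataset, task, and tool knowledge. The test design, the scores, and the normalized-gain metric are illustrative assumptions, not details taken from the paper.

    # Hypothetical scoring sketch (Python), not from the paper: quantify
    # "what is learned" as a normalized gain per knowledge type.
    from dataclasses import dataclass

    @dataclass
    class TestScores:
        pre: float        # score on the test taken before using the system
        post: float       # score on the transfer test taken afterward
        max_score: float  # maximum attainable score on either test

    def normalized_gain(s: TestScores) -> float:
        """Fraction of the possible improvement actually realized
        (a Hake-style gain); an assumption of this sketch, not a metric
        prescribed by the paper."""
        if s.max_score == s.pre:  # participant was already at ceiling
            return 0.0
        return (s.post - s.pre) / (s.max_score - s.pre)

    # One record per knowledge type named in the abstract.
    participant = {
        "dataset": TestScores(pre=4, post=9, max_score=12),
        "task":    TestScores(pre=6, post=10, max_score=12),
        "tool":    TestScores(pre=2, post=7, max_score=12),
    }
    for kind, scores in participant.items():
        print(f"{kind}: gain = {normalized_gain(scores):.2f}")

Averaging such gains over participants, or comparing them across two systems on the same transfer problem, would give the kind of quantitative handle on learning that the abstract calls for.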




    Published In

    BELIV '10: Proceedings of the 3rd BELIV'10 Workshop: BEyond time and errors: novel evaLuation methods for Information Visualization
    April 2010
    92 pages
ISBN: 9781450300070
DOI: 10.1145/2110192

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. evaluation methodology
    2. learning
    3. visualization

    Qualifiers

    • Research-article

    Conference

    CHI '10

    Acceptance Rates

BELIV '10 Paper Acceptance Rate: 12 of 18 submissions, 67%
Overall Acceptance Rate: 45 of 64 submissions, 70%


Cited By

• (2022) Learning Objectives, Insights, and Assessments: How Specification Formats Impact Design. IEEE Transactions on Visualization and Computer Graphics 28(1), 676-685. DOI: 10.1109/TVCG.2021.3114811. Online publication date: 1-Jan-2022.
• (2016) A Language and a Space. In Developing Effective Educational Experiences through Learning Analytics, 1-41. DOI: 10.4018/978-1-4666-9983-0.ch001. Online publication date: 2016.
• (2016) Knowledge-Assisted Ranking: A Visual Analytic Application for Sports Event Data. IEEE Computer Graphics and Applications 36(3), 72-82. DOI: 10.1109/MCG.2015.25. Online publication date: May-2016.
• (2014) Evaluating Graphs from a New Perspective. In Proceedings of the 2014 IEEE 17th International Conference on Computational Science and Engineering, 1648-1652. DOI: 10.1109/CSE.2014.303. Online publication date: 19-Dec-2014.
• (2012) Why ask why? In Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors - Novel Evaluation Methods for Visualization, 1-3. DOI: 10.1145/2442576.2442586. Online publication date: 14-Oct-2012.
