DOI: 10.1145/1240866.1240918
Article

Coming to terms: comparing and combining the results of multiple evaluators performing heuristic evaluation

Published: 28 April 2007

ABSTRACT

In this paper we describe a new way to perform heuristic evaluations that allows multiple evaluators to easily compare and combine the results of their reviews. This method was developed to provide a single, reliable result to the client, but it also allowed us to easily negotiate differences in our findings and to prioritize the usability problems identified by the evaluation. An unexpected side effect is that, by using this evaluation method, the practitioner can measure and predict the effect of usability improvements.


Published in

CHI EA '07: CHI '07 Extended Abstracts on Human Factors in Computing Systems
April 2007
1286 pages
ISBN: 9781595936424
DOI: 10.1145/1240866

      Copyright © 2007 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States




Acceptance Rates

CHI EA '07 Paper Acceptance Rate: 212 of 582 submissions, 36%
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%

