DOI: 10.1145/1358628.1358654

research-article

A comparative evaluation of heuristic-based usability inspection methods

Published: 5 April 2008

ABSTRACT

Given that heuristic evaluation (HE) remains a popular evaluation method among practitioners despite criticisms of its performance and reliability, there is a need to improve the method's performance. Several studies have shown HE-Plus, an emerging variant of HE, to outperform HE in both effectiveness and reliability. HE-Plus uses the same set of heuristics as HE; the only difference between the two methods is the 'usability problems profile' element in HE-Plus. This paper reports our attempt to verify the original profile employed in HE-Plus against the usability problem classification of the User Action Framework, together with an experiment that evaluated the outcome by comparing HE with two profile-based HE variants (HE-Plus and HE++) and a control group. Our results confirmed the role of 'usability problems profiles' in improving the performance and reliability of heuristic evaluation: both HE-Plus and HE++ outperformed HE in effectiveness as well as reliability.


Published in

CHI EA '08: CHI '08 Extended Abstracts on Human Factors in Computing Systems
April 2008
2035 pages
ISBN: 9781605580128
DOI: 10.1145/1358628

      Copyright © 2008 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 6,164 of 23,696 submissions, 26%
