ABSTRACT
Heuristic evaluation (HE) remains a popular evaluation method among practitioners despite longstanding criticisms of its performance and reliability, so there is a clear need to improve the method. Several studies have shown HE-Plus, an emerging variant of HE, to outperform HE in both effectiveness and reliability. HE-Plus uses the same set of heuristics as HE; the only difference between the two methods is the 'usability problems profile' element in HE-Plus. This paper reports our attempt to validate the original profile employed in HE-Plus against the usability problem classification in the User Action Framework, and an experiment evaluating the outcome by comparing HE with two profile-based HE variants (HE-Plus and HE++) and a control group. Our results confirm the role of the 'usability problems profile' in improving the performance and reliability of heuristic evaluation: both HE-Plus and HE++ outperformed HE in effectiveness as well as reliability.
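As a rough illustration of the comparison criteria mentioned above, the sketch below computes the effectiveness figure in the spirit of Hartson, Andre & Williges's UEM evaluation criteria (cited in the references), where effectiveness is the product of thoroughness (share of real problems found) and validity (share of reported issues that are real). The problem identifiers and counts are hypothetical, not data from the study.

```python
def effectiveness(reported: set, real: set) -> float:
    """Effectiveness = thoroughness x validity (per Hartson et al., 2003).

    thoroughness = |reported AND real| / |real|
    validity     = |reported AND real| / |reported|
    """
    hits = reported & real                # real problems the method actually found
    thoroughness = len(hits) / len(real)
    validity = len(hits) / len(reported)
    return thoroughness * validity

# Hypothetical toy data: 10 known real problems, one evaluator reports 5 issues,
# 4 of which are real and 1 of which is a false positive.
real_problems = {f"P{i}" for i in range(1, 11)}
reported_issues = {"P1", "P2", "P3", "P4", "X1"}

print(effectiveness(reported_issues, real_problems))  # 0.4 * 0.8 = 0.32
```

A method with higher thoroughness but many false positives can still score lower on this combined figure, which is why the metric is useful when ranking HE variants against each other.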
- AARP Audience-Centered Heuristics: Older Adults. http://www.redish.net/content/handouts/Audience-Centered_Heuristics.pdf.
- Andre, T. S., Hartson, H. R., Belz, S. M. & McCreary, F. A. The user action framework: a reliable foundation for usability engineering support tools. International Journal of Human-Computer Studies, 54 (2001), 107--136.
- Bailey, R. W. Heuristic evaluation vs. user testing. UI Design Update Newsletter, January 2001. http://www.humanfactors.com/downloads/jan01.asp.
- Chattratichart, J. & Brodie, J. Extending the heuristic evaluation method through contextualisation. In Proceedings of the 46th Annual Meeting of the Human Factors and Ergonomics Society, HFES (2002), 641--645.
- Chattratichart, J. & Brodie, J. HE-Plus -- Towards usage-centered expert review for website design. In Proc. forUse 2003, Ampersand Press (2003), 155--169.
- Chattratichart, J. & Brodie, J. Applying user testing data to UEM performance metrics. In Proc. CHI 2004, ACM Press (2004), 1119--1122.
- Gray, W. D. & Salzman, M. C. Damaged merchandise? A review of experiments that compare usability evaluation methods. Human-Computer Interaction, 13 (1998), 203--262.
- Hartson, H. R., Andre, T. S. & Williges, R. W. Criteria for evaluating usability evaluation methods. International Journal of Human-Computer Interaction, 15(1) (2003), 145--181.
- Khalayli, N., Nyhus, S., Hamnes, K. & Terum, T. Persona based rapid usability kick-off. In Proc. CHI 2007, ACM Press (2007), 1771--1776.
- Levi, M. D. & Conrad, F. G. A heuristic evaluation of a World Wide Web prototype. http://www.bls.gov/ore/htm_papers/st960160.htm.
- Lindgaard, G. & Chattratichart, J. Usability testing: What have we overlooked? In Proc. CHI 2007, ACM Press (2007), 1415--1424.
- Lund, A. M. The need for a standardized set of usability metrics. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, Santa Monica, CA: Human Factors and Ergonomics Society (1998), 688--691.
- Molich, R. & Dumas, J. S. Comparative usability evaluation (CUE-4). Behaviour & Information Technology, Taylor & Francis [electronic version] (2006).
- Nielsen, J. Heuristic evaluation. http://www.useit.com/papers/heuristic/.
- Nielsen, J. Usability Engineering. San Francisco: Morgan Kaufmann (1993).
- Norman, D. A. Cognitive engineering. In D. A. Norman & S. W. Draper (Eds.), User Centered System Design: New Perspectives on Human-Computer Interaction, Hillsdale, NJ: Lawrence Erlbaum Associates (1986), 31--61.
- Perfetti, C. Usability testing best practices: An interview with Rolf Molich. http://www.webpronews.com/topnews/2003/07/30/usability-testing-best-practices-an-interview-with-rolf-molich.
- Redish, G., Chisnell, D. & Lee, A. A new take on heuristic evaluation: Bringing personas, tasks, and heuristics together with a new model for understanding older adults as users. http://www.redish.net/content/talks.html.
- Schaffer, E. Why "how many users" is just the wrong question. UI Design Newsletter, May 2007: Insights from Human Factors International. http://www.humanfactors.com/downloads/may07.asp.
- Sears, A. Heuristic walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interaction, 9(3) (1997), 213--234.
- The Webby Awards Judging Criteria. http://www.webbyawards.com/entries/criteria.php.
Index Terms
- A comparative evaluation of heuristic-based usability inspection methods