Abstract
Heuristic evaluation (HE) is a method designed to help evaluators detect usability issues in a given system. It has gained popularity because it is a discount method, requiring relatively little time and few resources. However, novice evaluators have been reported to face difficulties when applying HE, producing results of lower quality than those of expert evaluators. For years, researchers have worked to improve HE in multiple ways: producing new heuristics, modifying and extending existing heuristics, or addressing and improving particular parts of the HE process. In this work, we provide a set of questions, based on Nielsen’s heuristics, for evaluators to ask themselves while examining a system. This list of questions can facilitate the detection of usability problems in any given system. The list is the result of interviews we conducted with 15 usability experts from both academia and industry.
© 2021 Springer Nature Switzerland AG
Cite this paper
Abulfaraj, A., Steele, A. (2021). Operational Usability Heuristics: A Question-Based Approach for Facilitating the Detection of Usability Problems. In: Soares, M.M., Rosenzweig, E., Marcus, A. (eds.) Design, User Experience, and Usability: UX Research and Design. HCII 2021. Lecture Notes in Computer Science, vol. 12779. Springer, Cham. https://doi.org/10.1007/978-3-030-78221-4_12
Print ISBN: 978-3-030-78220-7
Online ISBN: 978-3-030-78221-4