
What Makes Evaluators to Find More Usability Problems?: A Meta-analysis for Individual Detection Rates

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNPSE, volume 4550)

Abstract

Since many empirical results have accumulated in usability evaluation research, analyzing the combined results can provide usability practitioners with generalized guidelines. This study estimates the individual detection rate for user-based testing and heuristic evaluation through meta-analysis, and identifies significant factors that affect individual detection rates. Based on the results of 18 user-based testing and heuristic evaluation experiments, the individual detection rates for user-based testing and heuristic evaluation were estimated at 0.36 and 0.14, respectively. Expertise and task type were found to be significant factors for improving the individual detection rate in heuristic evaluation.
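Per-evaluator detection rates like these are commonly extrapolated with the problem-discovery model 1 − (1 − p)^n, which gives the expected share of problems found by n evaluators, each detecting a given problem independently with probability p. The Python sketch below applies that model to the abstract's estimates; the independence assumption and the example evaluator counts are illustrative choices, not values from the paper.

```python
def proportion_found(p: float, n: int) -> float:
    """Expected proportion of problems found by n evaluators, each
    detecting a given problem independently with probability p."""
    return 1.0 - (1.0 - p) ** n

# p values are the meta-analytic estimates reported in the abstract;
# the evaluator counts n are illustrative, not taken from the paper.
for method, p in [("user-based testing", 0.36), ("heuristic evaluation", 0.14)]:
    for n in (1, 3, 5, 10):
        print(f"{method:22s} n={n:2d} -> {proportion_found(p, n):.2f}")
```

Under this model and these estimates, five user-test participants would be expected to uncover roughly 89% of the problems, versus about 53% for five heuristic evaluators.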



Author information

Hwang, W., Salvendy, G.

Editor information

Julie A. Jacko


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hwang, W., Salvendy, G. (2007). What Makes Evaluators to Find More Usability Problems?: A Meta-analysis for Individual Detection Rates. In: Jacko, J.A. (eds) Human-Computer Interaction. Interaction Design and Usability. HCI 2007. Lecture Notes in Computer Science, vol 4550. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73105-4_55


  • DOI: https://doi.org/10.1007/978-3-540-73105-4_55

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-73104-7

  • Online ISBN: 978-3-540-73105-4

  • eBook Packages: Computer Science, Computer Science (R0)
