ABSTRACT
Remote asynchronous usability testing involves users directly in reporting usability problems. Most studies of this approach employ predefined tasks to ensure that users experience specific aspects of the system, whereas other studies use no task assignments. Yet the effect of using predefined tasks remains largely unexplored, and there is also limited research on how users should be instructed to identify usability problems. This paper reports on a comparative study of the effect of task assignments and instruction types on the problems identified in remote asynchronous usability testing of a website for information retrieval, involving 53 prospective users. The results show that users solving predefined tasks identified significantly more usability problems, with a significantly higher level of agreement, than those working on their own authentic tasks. Moreover, users who were instructed by means of examples of usability problems identified significantly more usability problems than those who received a conceptual definition of usability problems.
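The abstract does not say how "level of agreement" was computed, but agreement between evaluators in usability studies is commonly quantified with the any-two agreement measure (the mean Jaccard overlap of the problem sets reported by each pair of evaluators). A minimal Python sketch, using hypothetical problem identifiers rather than the study's actual data:

```python
from itertools import combinations

def any_two_agreement(problem_sets):
    """Mean pairwise Jaccard overlap |Pi & Pj| / |Pi | Pj| over all
    pairs of evaluators (each evaluator's findings given as a set)."""
    pairs = list(combinations(problem_sets, 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Hypothetical problem sets reported by three users of the same site
users = [{"P1", "P2", "P3"}, {"P2", "P3"}, {"P1", "P3", "P4"}]
print(round(any_two_agreement(users), 3))  # prints 0.472
```

Higher values indicate that users tend to report the same problems; a condition whose users overlap more (here, the predefined-task condition) would score higher on such a measure.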
Index Terms
- The effect of task assignments and instruction types on remote asynchronous usability testing