Remote Automated User Testing: First Steps toward a General-Purpose Tool

Part of the book series: Studies in Computational Intelligence (SCI, volume 296)

Abstract

In this paper we explore options for conducting remote, unattended usability tests that let users participate in their own environments and time zones, including multiple users taking part in the same study simultaneously. We developed a general-purpose tool, code-named "TCA" (for "Total Cost of Administration"), to catalog and analyze database administrators' behavior within software, and we present example findings from data collected over a period of six months. We analyzed users' deviations from the best paths through the software, in addition to collecting traditional measures such as time on task, error rate, and users' perceptions, including satisfaction. Finally, we explore how this type of tool holds particular promise for benchmark studies: it can also capture the ideal best-path performance of an error-free expert user, against which more realistic data, collected initially in laboratory tests or over time in longitudinal field studies, can be compared.
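
The deviation-from-best-path measure described in the abstract can be illustrated with a short sketch. The Python below is a minimal, hypothetical illustration, not the TCA tool's actual implementation: it assumes logged sessions are ordered lists of action names, and it measures deviation as the edit distance between a user's recorded action sequence and the expert best path, alongside time on task and error rate. All names (SessionLog, summarize, the action labels) are invented for this example.

```python
# Hypothetical sketch of best-path deviation analysis, as outlined in the
# abstract. The log format and function names are assumptions for
# illustration, not the TCA tool's real API.

from dataclasses import dataclass


@dataclass
class SessionLog:
    actions: list[str]   # ordered UI actions captured by the logger
    seconds: float       # time on task
    errors: int          # errors observed during the session


def edit_distance(a: list[str], b: list[str]) -> int:
    """Levenshtein distance between two action sequences."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]


def summarize(session: SessionLog, best_path: list[str]) -> dict:
    """Deviation from the expert best path plus traditional measures."""
    return {
        "deviation": edit_distance(session.actions, best_path),
        "time_on_task_s": session.seconds,
        "error_rate": session.errors / max(len(session.actions), 1),
    }


if __name__ == "__main__":
    best = ["open_console", "select_db", "run_backup", "confirm"]
    log = SessionLog(["open_console", "open_help", "select_db",
                      "run_backup", "confirm"], seconds=74.2, errors=1)
    print(summarize(log, best))  # deviation = 1 (one extra step taken)
```

A deviation of zero corresponds to the ideal benchmark run by an error-free expert; larger values indicate extra, missing, or substituted steps relative to the best path, which is one plausible way to operationalize the comparison the abstract describes.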




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Sarkar, C., Soderston, C., Klementiev, D., Bell, E. (2010). Remote Automated User Testing: First Steps toward a General-Purpose Tool. In: Lee, R., Ormandjieva, O., Abran, A., Constantinides, C. (eds) Software Engineering Research, Management and Applications 2010. Studies in Computational Intelligence, vol 296. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13273-5_3

  • DOI: https://doi.org/10.1007/978-3-642-13273-5_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-13272-8

  • Online ISBN: 978-3-642-13273-5

  • eBook Packages: Engineering, Engineering (R0)
