Abstract
In this paper we explore options for conducting remote, unattended usability tests that let users participate in their own environments and time zones, including studies in which multiple users participate simultaneously. We developed a general-purpose tool, code-named “TCA” (for “Total Cost of Administration”), to catalog and analyze database administrators’ behavior within software, and we present example findings from the data it collected over a period of six months. We analyzed users’ deviations from the best paths through the software, in addition to collecting traditional measures such as time on task, error rate, and users’ perceptions, including satisfaction level. Further, we explore how this type of tool holds particular promise for benchmark studies: it can also capture the ideal best-path performance of an error-free expert user, against which to compare real-world data collected initially in laboratory tests or over time in longitudinal field studies.
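To make the deviation analysis concrete, the sketch below shows one way logged action sequences might be scored against an ideal best path and reduced to the summative measures named above. The event schema, action names, and metric definitions here are illustrative assumptions for this sketch, not the TCA tool’s actual format; deviation is measured as edit distance between the observed sequence and the expert best path.

```python
# Minimal sketch, assuming a hypothetical event-log schema: score one
# task's log against an ideal best path and derive simple metrics.
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float  # seconds since task start (assumed units)
    action: str       # e.g. "open_wizard", "set_param", "commit"
    is_error: bool    # whether this step was flagged as an error

def path_deviation(observed: list[str], best_path: list[str]) -> int:
    """Levenshtein distance between the observed action sequence and
    the expert best path: insertions, deletions, and substitutions
    separating the user's path from error-free expert behavior."""
    m, n = len(observed), len(best_path)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if observed[i - 1] == best_path[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n]

def summarize(events: list[Event], best_path: list[str]) -> dict:
    """Reduce one task's event log to time on task, error rate, and
    best-path deviation (definitions are this sketch's assumptions)."""
    actions = [e.action for e in events]
    return {
        "time_on_task_s": events[-1].timestamp - events[0].timestamp,
        "error_rate": sum(e.is_error for e in events) / len(events),
        "best_path_deviation": path_deviation(actions, best_path),
    }

if __name__ == "__main__":
    best = ["open_wizard", "set_param", "commit"]
    log = [Event(0.0, "open_wizard", False),
           Event(4.2, "set_param", False),
           Event(9.8, "set_param", True),   # redundant step, flagged
           Event(12.5, "commit", False)]
    print(summarize(log, best))  # deviation of 1: one extra action
```

A benchmark study in the spirit described above would set the deviation and time baselines from the best path itself (deviation 0, expert completion time) and track how field or laboratory data converge toward them.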
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this chapter
Sarkar, C., Soderston, C., Klementiev, D., Bell, E. (2010). Remote Automated User Testing: First Steps toward a General-Purpose Tool. In: Lee, R., Ormandjieva, O., Abran, A., Constantinides, C. (eds) Software Engineering Research, Management and Applications 2010. Studies in Computational Intelligence, vol 296. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13273-5_3
DOI: https://doi.org/10.1007/978-3-642-13273-5_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-13272-8
Online ISBN: 978-3-642-13273-5