Analysis of Problems Found in User Testing Using an Approximate Model of User Action

  • Conference paper

Abstract

This paper describes an analysis of user testing using an approximate model that separates user action into Goal Formation, Action Specification, and Action Execution. The majority of the problems found in user testing, as reported in 30 usability reports, fell within the Action Specification phase of user action. In particular, problems in finding an action or object and in understanding the names used were most prevalent. The implication is that user testing, as carried out in an industrial setting, may help to ease Action Specification whilst neglecting potential problems in the other phases of user action.
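As a rough illustration only (not taken from the paper), the kind of coding exercise the abstract describes — assigning each reported problem to one of the three phases of the model and tallying the results — could be sketched as below. The Problem record layout and the sample data are hypothetical.

```python
from collections import Counter
from dataclasses import dataclass

# Phases of the approximate model of user action described in the abstract.
PHASES = ("Goal Formation", "Action Specification", "Action Execution")

@dataclass
class Problem:
    """One usability problem taken from a usability report (hypothetical record format)."""
    report_id: str
    description: str
    phase: str  # assumed to be one of PHASES

def tally_by_phase(problems):
    """Return, for each phase, the raw count of problems and its share of the total."""
    counts = Counter(p.phase for p in problems)
    total = sum(counts.values()) or 1
    return {phase: (counts[phase], counts[phase] / total) for phase in PHASES}

# Made-up example data from two hypothetical reports.
problems = [
    Problem("R01", "Could not find the 'Export' command", "Action Specification"),
    Problem("R01", "Menu label 'Collate' not understood", "Action Specification"),
    Problem("R02", "Unclear what goal the task required", "Goal Formation"),
]

for phase, (n, share) in tally_by_phase(problems).items():
    print(f"{phase}: {n} problem(s), {share:.0%} of total")
```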




Copyright information

© 1998 Springer-Verlag London

About this paper

Cite this paper

Lee, W.O. (1998). Analysis of Problems Found in User Testing Using an Approximate Model of User Action. In: Johnson, H., Nigay, L., Roast, C. (eds) People and Computers XIII. Springer, London. https://doi.org/10.1007/978-1-4471-3605-7_2

  • DOI: https://doi.org/10.1007/978-1-4471-3605-7_2

  • Publisher Name: Springer, London

  • Print ISBN: 978-3-540-76261-4

  • Online ISBN: 978-1-4471-3605-7
