
Seeing the System through the End Users’ Eyes: Shadow Expert Technique for Evaluating the Consistency of a Learning Management System

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5889)

Abstract

Interface consistency is a fundamental concept in web design and affects the performance and satisfaction of end users. Consistency also has significant effects on the learning performance of both expert and novice end users. Consequently, evaluating consistency within an e-learning system and eradicating irritating discrepancies during the user interface redesign is a major concern. In this paper, we report on our experiences with the Shadow Expert Technique (SET) during the evaluation of the consistency of the user interface of a large university learning management system. The main objective of this new usability evaluation method is to understand the interaction processes of end users with a specific system interface. Two teams of usability experts worked independently of each other in order to maximize the objectivity of the results. The outcome of the SET method is a list of recommended changes to improve the user interaction processes and thus facilitate high consistency.
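The abstract's core mechanism, two expert teams evaluating the interface independently and their findings being combined into a single list of recommended changes, can be illustrated with a minimal sketch. This is not code from the paper; the function name, the priority scheme, and all example findings are hypothetical, chosen only to show how independent agreement between teams can be used to rank inconsistencies.

```python
def merge_findings(team_a, team_b):
    """Combine two independently produced lists of interface inconsistencies.

    Hypothetical aggregation rule: findings reported by BOTH teams are
    treated as high-priority, since independent agreement strengthens
    objectivity; findings reported by only one team need further review.
    """
    a, b = set(team_a), set(team_b)
    return {
        "high_priority": sorted(a & b),  # reported by both teams independently
        "needs_review": sorted(a ^ b),   # reported by only one team
    }


if __name__ == "__main__":
    # Hypothetical example findings, not taken from the study.
    team_a = ["inconsistent button labels", "mixed date formats", "varying menu order"]
    team_b = ["mixed date formats", "varying menu order", "inconsistent icon set"]
    print(merge_findings(team_a, team_b))
```

The set intersection and symmetric difference make the aggregation rule explicit: overlap between the two independent teams is the signal of an objective inconsistency.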




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Holzinger, A., Stickel, C., Fassold, M., Ebner, M. (2009). Seeing the System through the End Users’ Eyes: Shadow Expert Technique for Evaluating the Consistency of a Learning Management System. In: Holzinger, A., Miesenberger, K. (eds) HCI and Usability for e-Inclusion. USAB 2009. Lecture Notes in Computer Science, vol 5889. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10308-7_12


  • DOI: https://doi.org/10.1007/978-3-642-10308-7_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-10307-0

  • Online ISBN: 978-3-642-10308-7

