
Web based evaluation of proactive user interfaces

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

Usability evaluation of new interface concepts often requires a user study to yield valid results. User studies, however, are a cost-intensive method compared to guideline-based usability evaluation. In the AUGUR project we conducted a user study to validate a new interface concept, proactive user interfaces. The evaluation method we used consists of four steps and relies on leveraging existing tools to realize each step. These tools largely automate the data-gathering step of a usability study and thereby lower the cost of conducting it; in particular, they allow for remote participation via a web browser. The obtained data can be analyzed to gain insight into how individual factors relate to the three subnotions of usability defined in ISO 9241: efficiency, effectiveness, and satisfaction. We applied the method to evaluate the proactive user interfaces developed in the AUGUR project, which aims at augmenting existing user interfaces with proactive and multimodal features to enhance the overall usability of an application. We also present the results of the study, which show that proactive augmentations are beneficial for the usability of a user interface and which serve as a case study for the application of the evaluation method.
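The remote, browser-based data gathering mentioned in the abstract rests on instrumenting the page the participant already visits, so no software needs to be installed locally. The sketch below illustrates this idea in TypeScript; it is not the AUGUR tooling, and the event schema and the /usability-log endpoint are assumptions made purely for illustration.

    // Minimal sketch of browser-side interaction logging for a remote
    // usability study. NOT the AUGUR tooling: the endpoint and event
    // schema are hypothetical, chosen only to illustrate automated
    // data gathering via a web browser.

    interface InteractionEvent {
      type: string;      // e.g. "click" or "input"
      target: string;    // identifier of the element interacted with
      timestamp: number; // ms since page load, usable for efficiency metrics
    }

    const events: InteractionEvent[] = [];

    function describeTarget(el: EventTarget | null): string {
      if (!(el instanceof HTMLElement)) return "unknown";
      return el.id ? `#${el.id}` : el.tagName.toLowerCase();
    }

    // Record clicks and form input as the participant works through a task.
    for (const type of ["click", "input"]) {
      document.addEventListener(
        type,
        (e) => {
          events.push({
            type,
            target: describeTarget(e.target),
            timestamp: performance.now(),
          });
        },
        true
      );
    }

    // Periodically ship the collected events to a (hypothetical) study
    // server, so the participant only needs an ordinary web browser.
    setInterval(() => {
      if (events.length === 0) return;
      const batch = events.splice(0, events.length);
      void fetch("/usability-log", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(batch),
      });
    }, 5000);

Timestamped event streams of this kind can later be aggregated into task-completion times (efficiency) and error counts (effectiveness), while satisfaction is typically collected separately via questionnaires.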



Author information


Correspondence to Daniel Schreiber.


About this article

Cite this article

Schreiber, D., Hartmann, M., Flentge, F. et al. Web based evaluation of proactive user interfaces. J Multimodal User Interfaces 2, 61–72 (2008). https://doi.org/10.1007/s12193-008-0001-5

