Evaluating teamwork support in tabletop groupware applications using collaboration usability analysis

  • Original Article
Personal and Ubiquitous Computing

Abstract

Tabletop groupware systems have natural advantages for collaboration, but they present a challenge for application designers because shared work and interaction progress in different ways than in desktop systems. As a result, tabletop systems still have problems with usability. We have developed a usability evaluation technique, T-CUA, that focuses attention on teamwork issues and that can help designers determine whether prototypes provide adequate support for the basic actions and interactions that are fundamental to table-based collaboration. We compared T-CUA with expert review in a user study where 12 evaluators assessed an early tabletop prototype using one of the two evaluation methods. The group using T-CUA found more teamwork problems and found problems in more areas than those using expert review; in addition, participants found T-CUA to be effective and easy to use. The success of T-CUA shows the benefits of using a set of activity primitives as the basis for discount usability techniques.
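The comparison reported in the abstract hinges on two simple measures per method: how many teamwork problems the evaluators reported, and how many distinct problem areas those reports covered. A minimal sketch of that tally, with hypothetical category names (not the actual T-CUA activity primitives) and invented example data:

```python
from collections import defaultdict

# Hypothetical evaluator findings as (method, problem area) pairs.
# Category names and counts are illustrative only, not study data.
findings = [
    ("T-CUA", "reaching"), ("T-CUA", "orientation"), ("T-CUA", "handoff"),
    ("T-CUA", "reaching"), ("expert review", "reaching"), ("expert review", "layout"),
]

def summarize(findings):
    """Count total reported problems and distinct problem areas per method."""
    counts = defaultdict(int)
    areas = defaultdict(set)
    for method, category in findings:
        counts[method] += 1
        areas[method].add(category)
    return {m: {"problems": counts[m], "areas": len(areas[m])}
            for m in counts}

print(summarize(findings))
```

On this toy data the T-CUA group reports more problems (4 vs. 2) spread over more areas (3 vs. 2), mirroring the shape of the comparison described above.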


Abbreviations

CUA: Collaboration usability analysis

T-CUA: Table-collaboration usability analysis


Author information

Correspondence to David Pinelle or Carl Gutwin.

Cite this article

Pinelle, D., Gutwin, C. Evaluating teamwork support in tabletop groupware applications using collaboration usability analysis. Pers Ubiquit Comput 12, 237–254 (2008). https://doi.org/10.1007/s00779-007-0145-4
