
Gesture analysis in a case study with a tangible user interface for collaborative problem solving

  • Original Paper
  • Published in: Journal on Multimodal User Interfaces

Abstract

This paper describes a case study conducted at the Public Research Centre Henri Tudor, Luxembourg, in November 2012. A tangible user interface (TUI) was used in the context of collaborative problem solving. The participants' task was to explore how external parameters, represented through physical objects, affect the electricity production of a windmill presented on a tangible tabletop. The goal of the study was to observe, analyze, and understand how multiple participants interact with the table while collaboratively solving a task. In this paper we focus on the gestures users performed during the experiment and on the other users' reactions to those gestures. Gestures were categorized into deictic/pointing, iconic, emblems, adaptors, and TUI-related. TUI-related/manipulative gestures, such as tracing and rotating, accounted for the largest share, followed by pointing gestures. In addition, we evaluated how actively each participant took part and whether gestures were accompanied by speech during the user study. Our case study can be described as a collaborative, problem-solving, cognitive activity, and it showed that gesturing facilitates group focus, enhances collaboration among participants, and encourages the use of epistemic actions.
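To make the gesture coding concrete, the sketch below shows one way such annotations could be tallied per category, per participant, and by speech co-occurrence. This is an illustrative example only, not the paper's actual analysis pipeline: the record format (participant, gesture category, speech co-occurrence) and all data values are assumed for the sake of the example.

```python
from collections import Counter

# Hypothetical annotation records: (participant, gesture_category, accompanied_by_speech).
# The categories follow the paper's scheme; the values below are illustrative only.
annotations = [
    ("P1", "tui-related", True),
    ("P1", "deictic", True),
    ("P2", "tui-related", False),
    ("P3", "iconic", True),
    ("P2", "adaptor", False),
    ("P3", "deictic", True),
]

# Count gestures per category to see which type dominates (e.g. TUI-related vs. pointing).
category_counts = Counter(cat for _, cat, _ in annotations)

# Count gestures per participant as a rough proxy for how actively each person took part.
participation = Counter(p for p, _, _ in annotations)

# Share of gestures that co-occur with speech.
speech_ratio = sum(1 for _, _, s in annotations if s) / len(annotations)

print("Gestures per category:", dict(category_counts))
print("Gestures per participant:", dict(participation))
print(f"Accompanied by speech: {speech_ratio:.0%}")
```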



Author information

Corresponding author

Correspondence to Eric Ras.


About this article


Cite this article

Anastasiou, D., Maquil, V. & Ras, E. Gesture analysis in a case study with a tangible user interface for collaborative problem solving. J Multimodal User Interfaces 8, 305–317 (2014). https://doi.org/10.1007/s12193-014-0158-z


