DOI: 10.1145/1133265.1133336
Article

Enabling interaction with single user applications through speech and gestures on a multi-user tabletop

Published: 23 May 2006

Abstract

Co-located collaborators often work over physical tabletops with rich geospatial information. Previous research shows that people use gestures and speech as they interact with artefacts on the table and communicate with one another. With the advent of large multi-touch surfaces, developers are now applying this knowledge to create appropriate technical innovations in digital table design. Yet they are limited by the difficulty of building a truly useful collaborative application from the ground up. In this paper, we circumvent this difficulty by: (a) building a multimodal speech and gesture engine around the DiamondTouch multi-user surface, and (b) wrapping existing, widely-used off-the-shelf single-user interactive spatial applications with a multimodal interface created from this engine. Through case studies of two quite different geospatial systems -- Google Earth and Warcraft III -- we show the new functionalities, feasibility and limitations of leveraging such single-user applications within a multi-user, multimodal tabletop. This research informs the design of future multimodal tabletop applications that can exploit single-user software conveniently available in the market. We also contribute (1) a set of technical and behavioural affordances of multimodal interaction on a tabletop, and (2) lessons learnt from the limitations of single-user applications.
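To make the wrapping idea above concrete, the sketch below shows one hypothetical way a fused speech-and-touch event could be replayed as ordinary mouse and keyboard input to an unmodified single-user application such as Google Earth. It is written in Python using the pyautogui input-synthesis library; the handle_multimodal_event function, the SPEECH_TO_KEYS vocabulary, and the specific key mappings are illustrative assumptions, not the engine, speech recognizer, or DiamondTouch APIs described in the paper.

    # Hypothetical sketch: fuse one recognized speech command with one table
    # touch point, then drive an unmodified single-user application by
    # synthesizing ordinary OS-level mouse/keyboard events.
    import pyautogui  # synthesizes mouse and keyboard input

    # Assumed vocabulary: speech command -> keystroke the wrapped app understands.
    SPEECH_TO_KEYS = {
        "zoom in": "+",
        "zoom out": "-",
    }

    def handle_multimodal_event(speech_command, touch_x, touch_y):
        """Replay a fused speech + touch event as single-user input."""
        # Gesture part: move the pointer to where the user touched the table.
        pyautogui.moveTo(touch_x, touch_y)
        if speech_command == "fly here":
            # A deictic command becomes a double-click at the touch point.
            pyautogui.doubleClick()
        elif speech_command in SPEECH_TO_KEYS:
            # Other commands become keystrokes the off-the-shelf app accepts.
            pyautogui.press(SPEECH_TO_KEYS[speech_command])

    # Example: a user says "fly here" while touching the table at (640, 360).
    handle_multimodal_event("fly here", 640, 360)

In this style of wrapper, the speech and gesture engine never modifies the target application; it only translates multimodal commands into the mouse and keyboard events the single-user program already expects.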





    Published In

    AVI '06: Proceedings of the working conference on Advanced visual interfaces
    May 2006
    512 pages
    ISBN: 1595933530
    DOI: 10.1145/1133265
    General Chair: Augusto Celentano

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. computer supported cooperative work
    2. multimodal speech and gesture interfaces
    3. tabletop interaction
    4. visual-spatial displays

    Qualifiers

    • Article

    Conference

    AVI06

    Acceptance Rates

    Overall Acceptance Rate 128 of 490 submissions, 26%



    Cited By

    • (2023) InstruMentAR: Auto-Generation of Augmented Reality Tutorials for Operating Digital Instruments Through Recording Embodied Demonstration. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3581442. Online publication date: 19-Apr-2023.
    • (2022) Iteratively Designing Gesture Vocabularies: A Survey and Analysis of Best Practices in the HCI Literature. ACM Transactions on Computer-Human Interaction, 29(4), 1-54. DOI: 10.1145/3503537. Online publication date: 5-May-2022.
    • (2022) Reducing the Cognitive Load of Playing a Digital Tabletop Game with a Multimodal Interface. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3491102.3502062. Online publication date: 29-Apr-2022.
    • (2022) Tabletop 3D Digital Map Interaction with Virtual Reality Handheld Controllers. Virtual, Augmented and Mixed Reality: Design and Development, 291-305. DOI: 10.1007/978-3-031-05939-1_19. Online publication date: 16-Jun-2022.
    • (2021) Interweaving Multimodal Interaction With Flexible Unit Visualizations for Data Exploration. IEEE Transactions on Visualization and Computer Graphics, 27(8), 3519-3533. DOI: 10.1109/TVCG.2020.2978050. Online publication date: 1-Aug-2021.
    • (2021) Affording embodied cognition through touchscreen and above-the-surface gestures during collaborative tabletop science learning. International Journal of Computer-Supported Collaborative Learning. DOI: 10.1007/s11412-021-09341-x. Online publication date: 13-May-2021.
    • (2020) Using Complexity-Identical Human- and Machine-Directed Utterances to Investigate Addressee Detection for Spoken Dialogue Systems. Sensors, 20(9), 2740. DOI: 10.3390/s20092740. Online publication date: 11-May-2020.
    • (2020) Towards Supporting Collaborative Spatial Planning: Conceptualization of a Maptable Tool through User Stories. ISPRS International Journal of Geo-Information, 9(1), 29. DOI: 10.3390/ijgi9010029. Online publication date: 3-Jan-2020.
    • (2019) Design intervention at major technological installations. Journal of Engineering, Design and Technology, 17(2), 402-413. DOI: 10.1108/JEDT-08-2018-0128. Online publication date: 1-Apr-2019.
    • (2017) Natural interaction with large map interfaces in VR. Proceedings of the 21st Pan-Hellenic Conference on Informatics, 1-6. DOI: 10.1145/3139367.3139424. Online publication date: 28-Sep-2017.
