DOI: 10.1145/2639189.2670267

Evaluating multimodal interaction with gestures and speech for point and select tasks

Published: 26 October 2014

ABSTRACT

Natural interactions such as speech and gestures have achieved mainstream success independently, with consumer products such as Leap Motion popularizing gestures while mobile phones have embraced speech input. In this paper we designed an interaction style that combines gestures and speech and evaluated it on point-and-select tasks. Our results indicate that while gestures are slower than the mouse, introducing speech allows selection to be performed without negatively impacting navigation. We also found that users adapt to this interaction quickly and improve their performance with minimal training. This lays the foundation for future work, such as mouse-replacement technologies for those with hand impairments.
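The abstract does not state the paper's evaluation metric, but point-and-select comparisons between input devices are conventionally scored with Fitts's-law throughput (the approach surveyed by Soukoreff and MacKenzie). A minimal sketch of that computation, using hypothetical trial numbers rather than values from this study:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Pointing throughput in bits per second for one trial."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical trial: target 256 px away, 32 px wide, selected in 1.5 s.
id_bits = index_of_difficulty(256, 32)  # log2(9) ~ 3.17 bits
tp = throughput(256, 32, 1.5)           # ~ 2.11 bits/s
```

Under this metric, a gesture condition being "slower than the mouse" shows up directly as lower throughput at the same index of difficulty.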


    • Published in

      NordiCHI '14: Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational
      October 2014
      361 pages
      ISBN:9781450325424
      DOI:10.1145/2639189

      Copyright © 2014 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Qualifiers

      • research-article

      Acceptance Rates

NordiCHI '14 Paper Acceptance Rate: 89 of 361 submissions, 25%
Overall Acceptance Rate: 379 of 1,572 submissions, 24%
