research-article
DOI: 10.1145/2971485.2971502

Insights on the Impact of Physical Impairments in Full-Body Motion Gesture Elicitation Studies

Published: 23 October 2016

Abstract

Elicitation studies have recently become a popular methodology for investigating novel gestural interfaces, yet little is known about the factors that may influence this type of study. To our knowledge, this paper is the first to investigate the impact of physical impairments in full-body motion gesture elicitation studies. Our study was conducted with 20 healthy volunteers and 12 volunteers with arm and/or hand impairments undergoing rehabilitation. In total, 1,707 gestures were logged, analyzed, and paired with think-aloud data for 27 referents performed with and without imposed physical constraints. Our findings, supported by observational analyses, reveal the challenges of achieving a single canonical gesture set, the most popular strategies for defining gestures, the variation of physical body engagement in full-body motion gestures, and a tendency toward personalized and hybrid gestures. Our results add to the few existing research papers aiming for a better understanding of the potential shortcomings of end-user interaction elicitation.


Cited By

  • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. Published 7 December 2023. DOI: 10.1145/3636458
  • (2023) How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-15. Published 22 October 2023. DOI: 10.1145/3597638.3608430
  • (2022) Theoretically-Defined vs. User-Defined Squeeze Gestures. Proceedings of the ACM on Human-Computer Interaction 6(ISS), 73-102. Published 14 November 2022. DOI: 10.1145/3567805
  • (2022) The Impacts of Referent Display on Gesture and Speech Elicitation. IEEE Transactions on Visualization and Computer Graphics 28(11), 3885-3895. Published November 2022. DOI: 10.1109/TVCG.2022.3203090
  • (2020) Understanding Multimodal User Gesture and Speech Behavior for Object Manipulation in Augmented Reality Using Elicitation. IEEE Transactions on Visualization and Computer Graphics 26(12), 3479-3489. Published December 2020. DOI: 10.1109/TVCG.2020.3023566

    Published In

    NordiCHI '16: Proceedings of the 9th Nordic Conference on Human-Computer Interaction
    October 2016
    1045 pages
    ISBN:9781450347631
    DOI:10.1145/2971485
    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gestures
    2. elicitation
    3. multimodal input
    4. participatory design
    5. user-defined gestures

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • German Federal Ministry of Education and Research (BMBF)
    • Association for the Advancement of Rehabilitation Research in Hamburg, Mecklenburg-Vorpommern und Schleswig-Holstein (VFFR)

    Conference

    NordiCHI '16

    Acceptance Rates

    NordiCHI '16 Paper Acceptance Rate: 58 of 231 submissions, 25%
    Overall Acceptance Rate: 379 of 1,572 submissions, 24%
