
Towards more natural digital content manipulation via user freehand gestural interaction in a living room

Published: 08 September 2013

Abstract

Advances in dynamic gesture recognition technologies now make it possible to investigate freehand input techniques. This study examined how users manipulate digital content on a distant screen through hand-gesture interaction in a living-room environment. Although many existing studies have investigated freehand input techniques, we developed and applied a novel study methodology that combines an existing user-elicitation study with a conventional Wizard-of-Oz study, involving an additional non-technical user who provided feedback. The study generated many useful issues and implications, not covered in previous work, for making freehand gesture interaction design more natural in a living-room environment. Furthermore, we observed how the initial user-defined gestures changed over time.




    Published In

    UbiComp '13: Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing
    September 2013, 846 pages
    ISBN: 9781450317702
    DOI: 10.1145/2493432

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. design method
    2. gesture interaction
    3. interaction design

    Qualifiers

    • Research-article

    Conference

    UbiComp '13

    Acceptance Rates

    UbiComp '13 paper acceptance rate: 92 of 394 submissions, 23%.
    Overall acceptance rate: 764 of 2,912 submissions, 26%.


    Cited By

    • (2024) User Preferences for Interactive 3D Object Transitions in Cross Reality - An Elicitation Study. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3656650.3656698. Online publication date: 3-Jun-2024.
    • (2024) Controlling the Rooms: How People Prefer Using Gestures to Control Their Smart Homes. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642687. Online publication date: 11-May-2024.
    • (2024) Better to Ask Than Assume: Proactive Voice Assistants' Communication Strategies That Respect User Agency in a Smart Home Environment. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642193. Online publication date: 11-May-2024.
    • (2024) Take a Seat, Make a Gesture: Charting User Preferences for On-Chair and From-Chair Gesture Input. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642028. Online publication date: 11-May-2024.
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
    • (2023) Toward a Multilingual Conversational Agent: Challenges and Expectations of Code-mixing Multilingual Users. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3581445. Online publication date: 19-Apr-2023.
    • (2023) Towards a Consensus Gesture Set: A Survey of Mid-Air Gestures in HCI for Maximized Agreement Across Domains. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-24. DOI: 10.1145/3544548.3581420. Online publication date: 19-Apr-2023.
    • (2023) Factors Affecting the Results of Gesture Elicitation: A Review. 2023 11th International Conference in Software Engineering Research and Innovation (CONISOFT), 169-176. DOI: 10.1109/CONISOFT58849.2023.00030. Online publication date: 6-Nov-2023.
    • (2022) Iteratively Designing Gesture Vocabularies: A Survey and Analysis of Best Practices in the HCI Literature. ACM Transactions on Computer-Human Interaction 29(4), 1-54. DOI: 10.1145/3503537. Online publication date: 5-May-2022.
    • (2022) ProGesAR: Mobile AR Prototyping for Proxemic and Gestural Interactions with Real-world IoT Enhanced Spaces. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3491102.3517689. Online publication date: 29-Apr-2022.
