ABSTRACT
Controlling a virtual character with one's free hands is a useful capability for many 3D applications, such as games, computer puppetry, and emerging virtual reality applications. So far, only specialist controls have been established in the animation industry, and little is known about novices' mental models of character control, a key ingredient for designing widely usable, natural, and expressive interfaces. To this end, we conducted a gesture elicitation study in which twelve participants performed mid-air gestures for thirteen given character motions. The observed mental models fall into two distinct categories: 1) external manipulation of an imagined physical puppet, and 2) the gesturing hands embodying the motion of the virtual body part being "controlled". The employed mental model determined both the hand posture and the mental transformation from gesture to character motion. We present and discuss a gesture set that can inform virtual puppetry interfaces for various application domains.
Embodiment or Manipulation? Understanding Users' Strategies for Free-Hand Character Control