ABSTRACT
We present WaveTrace, a novel interaction technique based on selection by motion matching. In motion matching systems, targets move continuously along a singular, pre-defined path; users interact with these targets by performing a synchronous bodily movement that matches the movement of one of the targets. Unlike previous work, which tracked user input through optical systems, WaveTrace is arguably the first motion matching technique to rely on motion data from the inertial measurement units readily available in many wrist-worn wearable devices such as smart watches. To evaluate the technique, we conducted a user study in which we varied: hand; degrees of visual angle; target speed; and number of concurrent targets. Preliminary results indicate that the technique supports up to eight concurrent targets, and that participants could select targets moving at speeds between 180°/s and 270°/s (mean acquisition time of 2,237 ms and average success rate of 91%).
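The selection mechanism the abstract describes can be sketched as a correlation test: the user's sampled 2-D movement trajectory is compared against the known trajectory of each on-screen target, and the target whose motion correlates most strongly, above a threshold, is selected. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation; the `match_target` helper, the 0.8 threshold, and the 30 Hz sampling rate are hypothetical choices for demonstration.

```python
import math

def correlate(a, b):
    """Pearson correlation between two equal-length sample sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va and vb else 0.0

def match_target(user_x, user_y, targets, threshold=0.8):
    """Return the index of the target whose 2-D trajectory best matches
    the user's movement, or None if no target clears the threshold on
    both axes (assumed selection criterion)."""
    best, best_score = None, threshold
    for i, (tx, ty) in enumerate(targets):
        score = min(correlate(user_x, tx), correlate(user_y, ty))
        if score > best_score:
            best, best_score = i, score
    return best

# Two circular targets moving in anti-phase; the user follows target 0.
T = [t / 30 for t in range(60)]               # two seconds at 30 Hz
targets = []
for phase in (0.0, math.pi):
    tx = [math.cos(2 * math.pi * t + phase) for t in T]
    ty = [math.sin(2 * math.pi * t + phase) for t in T]
    targets.append((tx, ty))
user_x = [x + 0.05 for x in targets[0][0]]    # offset copy of target 0
user_y = [y - 0.05 for y in targets[0][1]]
print(match_target(user_x, user_y, targets))  # → 0
```

Because Pearson correlation is invariant to offset and scale, the comparison tolerates the spatial mismatch between a small wrist gesture and a larger on-screen orbit, which is what makes matching by trajectory shape rather than absolute position practical.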