DOI: 10.1145/3027063.3053161
WaveTrace: Motion Matching Input using Wrist-Worn Motion Sensors

Published: 06 May 2017

ABSTRACT

We present WaveTrace, a novel interaction technique based on selection by motion matching. In motion matching systems, targets move continuously along singular, pre-defined paths -- users interact with these by performing a synchronous bodily movement that matches the movement of one of the targets. Unlike previous work, which tracks user input through optical systems, WaveTrace is arguably the first motion matching technique to rely on motion data from the inertial measurement units readily available in many wrist-worn wearable devices such as smartwatches. To evaluate the technique, we conducted a user study in which we varied: hand; degrees of visual angle; target speed; and number of concurrent targets. Preliminary results indicate that the technique supports up to eight concurrent targets, and that participants could select targets moving at speeds between 180°/s and 270°/s (mean acquisition time of 2237 ms; average success rate of 91%).
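The abstract does not spell out the selection algorithm, but motion-matching systems in the cited line of work (e.g., Pursuits, TraceMatch) typically select the target whose trajectory correlates most strongly with the user's input over a sliding window. The sketch below illustrates that general correlation-based approach; the function names, the per-axis minimum rule, and the 0.8 threshold are illustrative assumptions, not details from this paper.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b) if sd_a and sd_b else 0.0

def match_target(user_x, user_y, targets, threshold=0.8):
    """Return the index of the target whose x/y trajectory best
    correlates with the user's wrist motion over the current window,
    or None if no target clears the threshold.

    targets is a list of (xs, ys) trajectory pairs sampled at the
    same rate as the user's motion signal.
    """
    best, best_r = None, threshold
    for i, (tx, ty) in enumerate(targets):
        # Require agreement on both axes: take the weaker correlation.
        r = min(pearson(user_x, tx), pearson(user_y, ty))
        if r > best_r:
            best, best_r = i, r
    return best
```

Taking the minimum over both axes means a target is selected only when the wrist motion follows it in x and y simultaneously, which is what lets several concurrent targets (e.g., on the same circular path at different speeds or phases) remain distinguishable.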


Supplemental Material

lbw0411p.mp4 (MP4, 3.3 MB)

References

  1. M. R. Burke and G. R. Barnes. 2006. Quantitative differences in smooth pursuit and saccadic eye movements. Experimental Brain Research 175, 4: 596--608. https://doi.org/10.1007/s00221-006-0576-6
  2. Marcus Carter, Eduardo Velloso, John Downs, Abigail Sellen, Kenton O'Hara, and Frank Vetere. 2016. PathSync: Multi-User Gestural Interaction with Touchless Rhythmic Path Mimicry. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), 3415--3427. https://doi.org/10.1145/2858036.2858284
  3. Christopher Clarke, Alessio Bellino, Augusto Esteves, Eduardo Velloso, and Hans Gellersen. 2016. TraceMatch: A Computer Vision Technique for User Input by Tracing of Animated Controls. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16), 298--303. https://doi.org/10.1145/2971648.2971714
  4. Augusto Esteves, Eduardo Velloso, Andreas Bulling, and Hans Gellersen. 2015. Orbits: Gaze Interaction for Smart Watches Using Smooth Pursuit Eye Movements. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15), 457--466. https://doi.org/10.1145/2807442.2807499
  5. Jean-Daniel Fekete, Niklas Elmqvist, and Yves Guiard. 2009. Motion-pointing: Target Selection Using Elliptical Motions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09), 289--298. https://doi.org/10.1145/1518701.1518748
  6. Sandra G. Hart. 2006. Nasa-Task Load Index (NASA-TLX); 20 Years Later. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 50, 9: 904--908. https://doi.org/10.1177/154193120605000909
  7. Moritz Kassner, William Patera, and Andreas Bulling. 2014. Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. 1151--1160. https://doi.org/10.1145/2638728.2641695
  8. Franziska Roesner, Tadayoshi Kohno, and David Molnar. 2014. Security and Privacy for Augmented Reality Systems. Commun. ACM 57, 4: 88--96. https://doi.org/10.1145/2580723.2580730
  9. Eduardo Velloso, Markus Wirth, Christian Weichel, Augusto Esteves, and Hans Gellersen. 2016. AmbiGaze: Direct Control of Ambient Devices by Gaze. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (DIS '16), 812--817. https://doi.org/10.1145/2901790.2901867
  10. Mélodie Vidal, Andreas Bulling, and Hans Gellersen. 2013. Pursuits: Spontaneous Interaction with Displays Based on Smooth Pursuit Eye Movement and Moving Targets. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '13), 439--448. https://doi.org/10.1145/2493432.2493477
  11. John Williamson and Roderick Murray-Smith. 2004. Pointing Without a Pointer. In CHI '04 Extended Abstracts on Human Factors in Computing Systems (CHI EA '04), 1407--1410. https://doi.org/10.1145/985921.986076

Published in

CHI EA '17: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems
May 2017, 3954 pages
ISBN: 9781450346566
DOI: 10.1145/3027063

      Copyright © 2017 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Qualifiers

      • abstract

Acceptance Rates

CHI EA '17 Paper Acceptance Rate: 1,000 of 5,000 submissions, 20%
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%
