research-article
DOI: 10.1145/2971763.2971765

Whoosh: non-voice acoustics for low-cost, hands-free, and rapid input on smartwatches

Published: 12 September 2016

ABSTRACT

We present an alternate approach to smartwatch interaction that uses non-voice acoustic input captured by the device's microphone to complement touch and speech. Whoosh is an interaction technique that recognizes the type and length of acoustic events performed by the user, enabling low-cost, hands-free, and rapid input on smartwatches. We build a recognition system capable of detecting non-voice events directed at and around the watch, including blows, sip-and-puff, and directional air swipes, without hardware modifications to the device. Further, inspired by the design of musical instruments, we develop a custom modification of the physical structure of the watch case that passively alters the acoustic response of events around the bezel; this physical redesign expands our input vocabulary with no additional electronics. We evaluate our technique across 8 users: 10 events yield up to 90.5% ten-fold cross-validation accuracy on an unmodified watch, and 14 events yield 91.3% ten-fold cross-validation accuracy with an instrumented watch case. Finally, we share a number of demonstration applications, including multi-device interactions, to highlight our technique with a real-time recognizer running on the watch.
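The pipeline the abstract describes, spectral features extracted from the microphone signal, a per-event classifier, and ten-fold cross validation, can be sketched as below. Everything here is an illustrative assumption rather than the paper's actual method: the frequency bands, the synthetic "blow"/"swipe" signals, and the nearest-centroid classifier are stand-ins for the richer acoustic features and trained recognizer the real system uses.

```python
import math
import random

RATE = 8000                         # assumed sample rate (Hz)
N = 256                             # analysis window length (samples)
BANDS = [(100, 600), (1500, 2500)]  # illustrative frequency bands

def band_energies(samples):
    """Energy in each frequency band via a naive DFT -- a stand-in
    for the richer spectral features a real recognizer would use."""
    n = len(samples)
    feats = []
    for lo, hi in BANDS:
        e = 0.0
        for k in range(lo * n // RATE, hi * n // RATE):
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            e += (re * re + im * im) / n
        feats.append(e)
    return feats

def synth_event(freq, rng):
    """Synthetic 'acoustic event': a tone plus noise (illustration only)."""
    return [math.sin(2 * math.pi * freq * i / RATE) + rng.gauss(0, 0.1)
            for i in range(N)]

def ten_fold_accuracy(X, y, folds=10):
    """Score a nearest-centroid classifier with k-fold cross validation."""
    idx = list(range(len(X)))
    random.Random(0).shuffle(idx)
    correct = 0
    for f in range(folds):
        test = idx[f::folds]
        train = [i for i in idx if i not in test]
        # compute one centroid per class from the training folds
        cents = {}
        for label in set(y):
            rows = [X[i] for i in train if y[i] == label]
            cents[label] = [sum(col) / len(rows) for col in zip(*rows)]
        for i in test:
            pred = min(cents, key=lambda lab: sum(
                (a - b) ** 2 for a, b in zip(X[i], cents[lab])))
            correct += pred == y[i]
    return correct / len(X)

# two easily separable synthetic event classes, 20 windows each
rng = random.Random(42)
X, y = [], []
for _ in range(20):
    X.append(band_energies(synth_event(300, rng)));  y.append("blow")
    X.append(band_energies(synth_event(2000, rng))); y.append("swipe")

print(ten_fold_accuracy(X, y))
```

On this deliberately separable toy data the cross-validation accuracy is perfect; the point is only the shape of the evaluation loop, not the numbers reported in the paper.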



    • Published in

      cover image ACM Conferences
      ISWC '16: Proceedings of the 2016 ACM International Symposium on Wearable Computers
      September 2016
      207 pages
      ISBN:9781450344609
      DOI:10.1145/2971763

      Copyright © 2016 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States



Acceptance Rates

ISWC '16 Paper Acceptance Rate: 18 of 95 submissions, 19%. Overall Acceptance Rate: 38 of 196 submissions, 19%.
