DOI: 10.1145/2470654.2481352
research-article

Gesture output: eyes-free output using a force feedback touch surface

Published: 27 April 2013

ABSTRACT

We propose using spatial gestures not only for input but also for output. Analogous to gesture input, the proposed gesture output moves the user's finger in a gesture, which the user then recognizes. We use our concept in a mobile scenario where a motion path forming a "5" informs users about new emails, or a heart-shaped path serves as a message from a friend. We built two prototypes: (1) The longRangeOuija is a stationary prototype that offers a motion range of up to 4cm; (2) The pocketOuija is a self-contained mobile device based on an iPhone with up to 1cm motion range. Both devices actuate the user's fingers by means of an actuated transparent foil overlaid onto a touchscreen. We conducted three studies with the longRangeOuija in which participants recognized 2cm marks with 97% accuracy, Graffiti digits with 98.8%, pairs of Graffiti digits with 90.5%, and Graffiti letters with 93.4%. Participants previously unfamiliar with Graffiti identified 96.2% of digits and 76.4% of letters, suggesting that properly designed gesture output is guessable. After the experiment, the same participants were able to enter 100% of Graffiti digits by heart and 92.2% of letters. This suggests that participants learned gesture input as a side effect of using gesture output on our prototypes.
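The abstract describes the core mechanism only at a high level: a gesture is replayed to the user by dragging their finger along a motion path (for example, a Graffiti-style "5") using an actuated transparent foil. The sketch below is a minimal illustration of that idea, assuming the gesture is stored as a list of 2D waypoints and that some actuator driver (here a hypothetical move_to callback) can position the foil; the coordinates, timing, and driver interface are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only -- not the paper's implementation.
    # A gesture is stored as 2D waypoints (cm, relative to the finger's start
    # position); playback steps a hypothetical actuator driver along the path.
    import time

    # Rough waypoints tracing a Graffiti-style "5" (coordinates are made up).
    DIGIT_5 = [(1.0, 2.0), (0.0, 2.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]

    def play_gesture(path, duration_s=1.0, steps_per_segment=20, move_to=print):
        """Linearly interpolate between consecutive waypoints and step the actuator."""
        segments = list(zip(path, path[1:]))
        dt = duration_s / (len(segments) * steps_per_segment)
        for (x0, y0), (x1, y1) in segments:
            for i in range(1, steps_per_segment + 1):
                t = i / steps_per_segment
                move_to(x0 + t * (x1 - x0), y0 + t * (y1 - y0))
                time.sleep(dt)

    # Example: a print call stands in for the real actuator driver.
    play_gesture(DIGIT_5, move_to=lambda x, y: print(f"({x:.2f}, {y:.2f})"))

Swapping the print stand-in for a real driver would replay the same path on hardware along the lines of the prototypes described above.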


Supplemental Material

chi0102-file3.mp4 (MP4, 21.2 MB)


      Published in

      CHI '13: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
      April 2013
      3550 pages
      ISBN: 9781450318990
      DOI: 10.1145/2470654

      Copyright © 2013 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 27 April 2013


      Acceptance Rates

      CHI '13 Paper Acceptance Rate: 392 of 1,963 submissions, 20%. Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%.
