ABSTRACT
This paper investigates users' ability to perform force-sensitive tapping and explores its potential as an input modality for touch-based systems. We study force-sensitive tapping using Expressive Touch, a tabletop interface that infers tapping force from the sound waves created by the user's finger upon impact. The first part of the paper describes the implementation of Expressive Touch and shows how existing tabletop interfaces can be augmented to reliably detect tapping force across the entire surface. The second part reports the results of three studies of force-sensitive tapping. First, we use a classic psychophysical task to gain insight into participants' perception of tapping force (Study 1). The results show that although participants tap with different absolute forces, they share a similar perception of relative tapping force. Second, we investigate participants' ability to control tapping force (Study 2) and find that users can produce two force levels with 99% accuracy; for six force levels, accuracy drops to 58%. Third, we investigate the usability of force-sensitive tapping by studying participants' reactions to seven force-sensitive touch applications (Study 3).
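The abstract's finding that two force levels can be distinguished far more reliably than six suggests a simple thresholding scheme over the impact sound's intensity. The sketch below is a hypothetical illustration, not the authors' implementation: it classifies a tap into discrete force levels from the peak amplitude of a microphone signal, with illustrative per-user calibration thresholds (the names `classify_force` and `synthetic_tap` are assumptions for this example).

```python
import math

def peak_amplitude(samples):
    """Return the maximum absolute sample value of the impact sound."""
    return max(abs(s) for s in samples)

def classify_force(samples, thresholds=(0.2, 0.6)):
    """Map peak amplitude to a discrete force level (0 = soft, ..., N = hard).

    `thresholds` are illustrative calibration values; Study 2's results
    suggest they would need per-user tuning, and that only a small number
    of levels can be produced reliably.
    """
    peak = peak_amplitude(samples)
    level = 0
    for t in thresholds:
        if peak > t:
            level += 1
    return level

def synthetic_tap(strength, n=256):
    """A stand-in impact sound: a decaying oscillation scaled by tap strength."""
    return [strength * math.exp(-i / 40.0) * math.sin(i * 0.9) for i in range(n)]

soft_level = classify_force(synthetic_tap(0.15))   # below both thresholds
hard_level = classify_force(synthetic_tap(0.9))    # above both thresholds
```

With two thresholds this yields three levels; dropping to a single threshold mirrors the two-level case the paper reports as most accurate.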
Expressive Touch: Studying Tapping Force on Tabletops