Research article
DOI: 10.1145/2148131.2148155

PinchPad: performance of touch-based gestures while grasping devices

Published: 19 February 2012

Abstract

This paper investigates touch-based gestures that combine front- and back-of-device interaction on grasped devices. We designed generic interactions for discrete, continuous, and combined gesture commands that must be executed without hand-eye control, because the performing fingers are hidden behind the grasped device. The interactions are designed so that the thumb can always serve as a proprioceptive reference for guiding finger movements, exploiting embodied knowledge about the structure of one's own body. To evaluate them, we mounted two iPads back-to-back to form a double-sided touchscreen device, the PinchPad, and tested the interactions in a user study for performance and perceived task load. We discuss the main errors that reduced accuracy, identify stable features that lower the error rate, and examine the role of the 'body schema' in designing gesture-based interactions where users cannot see their hands.
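The abstract does not detail an implementation, but the core sensing idea (pairing a thumb touch on the front screen with a touch from a hidden finger on the back screen) can be sketched in a few lines. The Python below is a minimal illustration, not the authors' code: the Touch event format, the mirror_back_touch helper, and the PINCH_RADIUS threshold are all assumptions. It mirrors back-of-device coordinates into the front screen's coordinate system and reports a pinch when the two touches roughly coincide.

```python
import math
from dataclasses import dataclass

@dataclass
class Touch:
    """Hypothetical touch event; (x, y) in the reporting screen's own coordinates."""
    x: float
    y: float

SCREEN_WIDTH = 768.0   # assumed logical width of both displays, in points
PINCH_RADIUS = 60.0    # assumed maximum front/back offset still counted as a pinch

def mirror_back_touch(t: Touch) -> Touch:
    """Map a back-of-device touch into front-screen coordinates.

    With two displays mounted back-to-back, the rear screen is seen
    mirror-reversed from the front, so only the x axis needs flipping.
    """
    return Touch(SCREEN_WIDTH - t.x, t.y)

def detect_pinch(front: Touch, back: Touch) -> bool:
    """Report a pinch when the thumb (front) and a hidden finger (back)
    touch nearly opposite points on the two screens."""
    b = mirror_back_touch(back)
    return math.hypot(front.x - b.x, front.y - b.y) <= PINCH_RADIUS

# Thumb at (200, 300) on the front, index finger almost directly behind it.
print(detect_pinch(Touch(200.0, 300.0), Touch(568.0, 310.0)))  # True
```

The offset between a front touch and the mirrored back touch is exactly the kind of systematic error the study quantifies, so in practice PINCH_RADIUS (or a per-user offset correction) would need calibration.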




    Published In

    TEI '12: Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction
    February 2012
    413 pages
ISBN: 978-1-4503-1174-8
DOI: 10.1145/2148131
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 19 February 2012


    Author Tags

    1. body schema
    2. gesture
    3. grasp
    4. mobile devices
    5. offset
    6. pinch

    Qualifiers

    • Research-article

    Conference

TEI '12

    Acceptance Rates

    Overall Acceptance Rate 393 of 1,367 submissions, 29%



    Bibliometrics & Citations

    Article Metrics

• Downloads (last 12 months): 19
• Downloads (last 6 weeks): 2
    Reflects downloads up to 25 Feb 2025

    Citations

    Cited By

• (2023) AdHocProx: Sensing Mobile, Ad-Hoc Collaborative Device Formations using Dual Ultra-Wideband Radios. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3544548.3581300. Online publication date: 19-Apr-2023.
• (2022) Squeezy-Feely: Investigating Lateral Thumb-Index Pinching as an Input Modality. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3491102.3501981. Online publication date: 29-Apr-2022.
• (2022) Weaving Fire into Form. Online publication date: 20-Jul-2022.
• (2018) InfiniTouch. Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, 779-792. DOI: 10.1145/3242587.3242605. Online publication date: 11-Oct-2018.
• (2017) GazeGrip. Proceedings of the 29th Australian Conference on Computer-Human Interaction, 467-471. DOI: 10.1145/3152771.3156159. Online publication date: 28-Nov-2017.
• (2017) Duopography. SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, 1-6. DOI: 10.1145/3132787.3139197. Online publication date: 27-Nov-2017.
• (2017) A smartphone prototype for touch interaction on the whole device surface. Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, 1-8. DOI: 10.1145/3098279.3122143. Online publication date: 4-Sep-2017.
• (2017) PocketThumb. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1(2), 1-17. DOI: 10.1145/3090055. Online publication date: 30-Jun-2017.
• (2017) BackXPress. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 4654-4666. DOI: 10.1145/3025453.3025565. Online publication date: 2-May-2017.
• (2017) My Tablet Is Moving Around, Can I Touch It? Proceedings of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction, 495-502. DOI: 10.1145/3024969.3025082. Online publication date: 20-Mar-2017.
