Research article · CHI Conference Proceedings
DOI: 10.1145/2556288.2557239

More than touch: understanding how people use skin as an input surface for mobile computing

Published: 26 April 2014

Abstract

This paper contributes results from an empirical study of on-skin input, an emerging technique for controlling mobile devices. Skin is fundamentally different from off-body touch surfaces, opening up a new and largely unexplored interaction space. We investigate the characteristics of skin-specific input modalities, analyze what kinds of gestures are performed on skin, and study which input locations users prefer. Our main findings show that (1) users intuitively leverage the properties of skin for a wider range of more expressive commands than on conventional touch surfaces; (2) established multi-touch gestures can be transferred to on-skin input; (3) physically uncomfortable modalities are deliberately used for irreversible commands and for expressing negative emotions; and (4) the forearm and the hand are the most preferred locations on the upper limb for on-skin input. We detail users' mental models and contribute a first consolidated set of on-skin gestures. Our findings provide guidance for developers of future sensors as well as for designers of future applications of on-skin input.

Supplementary Material

MP4 File (p179-sidebyside.mp4)



    Published In

    CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    April 2014
    4206 pages
    ISBN:9781450324731
    DOI:10.1145/2556288


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. deformable surface
    2. elicitation study
    3. mobile computing
    4. on-skin input
    5. skin gestures
    6. touch input

    Qualifiers

    • Research-article

    Conference

CHI '14: CHI Conference on Human Factors in Computing Systems
April 26 - May 1, 2014
Toronto, Ontario, Canada

    Acceptance Rates

CHI '14 Paper Acceptance Rate: 465 of 2,043 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



    Article Metrics

• Downloads (last 12 months): 113
• Downloads (last 6 weeks): 3
    Reflects downloads up to 17 Feb 2025


Cited By
• (2024) Exploring User-Defined Gestures as Input for Hearables and Recognizing Ear-Level Gestures with IMUs. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-23. DOI: 10.1145/3676503. Online publication date: 24-Sep-2024.
• (2024) Designing with Discomfort: A Feminist Approach towards Intimate Care Technologies. Companion Publication of the 2024 ACM Designing Interactive Systems Conference, 30-36. DOI: 10.1145/3656156.3665134. Online publication date: 1-Jul-2024.
• (2024) EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3654777.3676455. Online publication date: 13-Oct-2024.
• (2024) Understanding Gesture and Microgesture Inputs for Augmented Reality Maps. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 409-423. DOI: 10.1145/3643834.3661630. Online publication date: 1-Jul-2024.
• (2024) GraspUI: Seamlessly Integrating Object-Centric Gestures within the Seven Phases of Grasping. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1275-1289. DOI: 10.1145/3643834.3661551. Online publication date: 1-Jul-2024.
• (2024) Exploring the Somatic Possibilities of Shape Changing Car Seats. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 3354-3371. DOI: 10.1145/3643834.3661518. Online publication date: 1-Jul-2024.
• (2024) RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures. ACM Transactions on Computer-Human Interaction 31(2), 1-36. DOI: 10.1145/3617365. Online publication date: 29-Jan-2024.
• (2024) HSM: Hand Surface Menu Solution for Interaction in VR. 2024 10th International Conference on Virtual Reality (ICVR), 167-176. DOI: 10.1109/ICVR62393.2024.10869121. Online publication date: 24-Jul-2024.
• (2024) PalmSpace. International Journal of Human-Computer Studies 184(C). DOI: 10.1016/j.ijhcs.2024.103219. Online publication date: 1-Apr-2024.
• (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
