DOI: 10.1145/2047196.2047259
Research Article

Pub - point upon body: exploring eyes-free interaction and methods on an arm

Published: 16 October 2011

Abstract

This paper presents PUB (Point Upon Body), a novel interaction system that explores eyes-free interaction in personal space by letting users tap on their own arms, receiving natural haptic feedback from their skin. Two user studies examine how precisely users can interact with their forearms and how they behave when operating in this arm space. The results show that, with iterative practice, ordinary users can divide the space between wrist and elbow into at most six distinct points. The results also indicate that each user's division pattern is unique. Based on design principles drawn from these observations, the PUB system demonstrates how interaction design can benefit from these findings. Two scenarios, remote display control and mobile device control, are demonstrated using an ultrasonic device attached to the user's wrist to detect tapped positions.
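
A minimal sketch can make the interaction model concrete. The Python snippet below is illustrative only, not the authors' implementation: the sensor input, distance units, and uniform zone widths are assumptions, and the paper's finding that each user's division pattern is unique implies a real system would calibrate boundaries per user. The sketch maps a hypothetical distance reading (centimeters from the wrist) from a wrist-mounted ultrasonic sensor to one of the six tap targets the studies found users can distinguish between wrist and elbow.

```python
# Illustrative sketch only -- not the PUB authors' implementation.
# Assumes a hypothetical wrist-mounted ultrasonic sensor that reports
# a tap's distance from the wrist in centimeters.

def calibrate_zones(wrist_to_elbow_cm: float, n_targets: int = 6) -> list[float]:
    """Return zone boundaries dividing the forearm into n_targets zones.

    Uniform widths are a simplifying assumption; the paper finds each
    user's division pattern is unique, so a real system would record
    per-user boundaries during a calibration phase instead.
    """
    width = wrist_to_elbow_cm / n_targets
    return [i * width for i in range(n_targets + 1)]

def classify_tap(distance_cm: float, boundaries: list[float]) -> int | None:
    """Map a sensed tap distance to a 0-based target index, or None."""
    if not boundaries[0] <= distance_cm <= boundaries[-1]:
        return None  # tap landed outside the calibrated wrist-elbow span
    for i in range(len(boundaries) - 1):
        if distance_cm <= boundaries[i + 1]:
            return i
    return None

# Example: a 25 cm forearm, tap sensed 9 cm from the wrist -> target 2,
# which a controller could bind to, e.g., a display or phone shortcut.
zones = calibrate_zones(25.0)
print(classify_tap(9.0, zones))
```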

Supplementary Material

JPG File (fp369.jpg)
MP4 File (fp369.mp4)





    Published In

    UIST '11: Proceedings of the 24th annual ACM symposium on User interface software and technology
    October 2011
    654 pages
    ISBN:9781450307161
    DOI:10.1145/2047196

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 16 October 2011


    Author Tags

    1. eyes-free interaction
    2. kinetic user interface
    3. natural haptic feedback

    Qualifiers

    • Research-article

    Conference

    UIST '11

    Acceptance Rates

    UIST '11 Paper Acceptance Rate: 67 of 262 submissions, 26%
    Overall Acceptance Rate: 561 of 2,567 submissions, 22%

    Bibliometrics

    Article Metrics

    • Downloads (Last 12 months): 47
    • Downloads (Last 6 weeks): 6

    Reflects downloads up to 10 Feb 2025

    Cited By
    • (2024) EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3654777.3676455
    • (2024) Understanding Gesture and Microgesture Inputs for Augmented Reality Maps. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 409-423. DOI: 10.1145/3643834.3661630
    • (2024) BodyTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1-22. DOI: 10.1145/3631426
    • (2024) PneuMa: Designing Pneumatic Bodily Extensions for Supporting Movement in Everyday Life. Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-16. DOI: 10.1145/3623509.3633349
    • (2024) RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures. ACM Transactions on Computer-Human Interaction 31(2), 1-36. DOI: 10.1145/3617365
    • (2023) LapTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1-23. DOI: 10.1145/3610878
    • (2023) Augmenting On-Body Touch Input with Tactile Feedback Through Fingernail Haptics. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3544548.3581473
    • (2023) The role of interface configuration on performance accuracy in eyes-free touchscreen interaction. Universal Access in the Information Society. DOI: 10.1007/s10209-023-01057-z
    • (2023) Embodied Interaction on Constrained Interfaces for Augmented Reality. Springer Handbook of Augmented Reality, 239-271. DOI: 10.1007/978-3-030-67822-7_10
    • (2022) Body-Centric NFC: Body-Centric Interaction with NFC Devices Through Near-Field Enabled Clothing. Proceedings of the 2022 ACM Designing Interactive Systems Conference, 1626-1639. DOI: 10.1145/3532106.3534569
