Research article
DOI: 10.1145/2047196.2047233

Imaginary phone: learning imaginary interfaces by transferring spatial memory from a familiar device

Published: 16 October 2011

Abstract

We propose a method for learning how to use an imaginary interface (i.e., a spatial non-visual interface) that we call "transfer learning": by using a physical device (e.g., an iPhone), a user inadvertently learns the interface and can then transfer that knowledge to an imaginary interface. We illustrate this concept with our Imaginary Phone prototype. With it, users interact by mimicking the use of a physical iPhone, tapping and sliding on their empty non-dominant hand without visual feedback. Pointing on the hand is tracked using a depth camera, and touch events are sent wirelessly to an actual iPhone, where they invoke the corresponding actions. Our prototype allows the user to perform everyday tasks such as picking up a phone call or launching the timer app and setting an alarm. Imaginary Phone thereby serves as a shortcut that frees users from the need to retrieve the actual physical device. We present two user studies that validate the three assumptions underlying the transfer learning method. (1) Users build up spatial memory automatically while using a physical device: participants knew the correct locations of 68% of their own iPhone home screen apps by heart. (2) Spatial memory transfers from a physical to an imaginary interface: participants recalled the locations of 61% of their home screen apps on the palm of their hand. (3) Palm interaction is precise enough to operate a typical mobile phone: participants could reliably acquire 0.95 cm wide iPhone targets on their palm, sufficiently large to operate any standard iPhone widget.
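To make the interaction loop concrete, here is a minimal Python sketch of the pipeline the abstract describes: a calibrated palm region stands in for the phone screen, a depth-camera touch point is mapped onto the home screen's icon grid, and the resulting event is forwarded wirelessly to the phone. All specifics (the grid dimensions, calibration values, network address, and JSON wire format) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the Imaginary Phone event pipeline: a depth camera
# reports where the finger touched the palm, the touch point is mapped onto
# the layout of the physical phone's screen, and the event is forwarded
# wirelessly to the phone. Calibration, grid size, and wire format are
# assumptions for illustration only.
import json
import socket

# Assumed calibration: a palm-space bounding box (in cm) registered once per
# user, standing in for the corners of the imaginary screen.
PALM_ORIGIN = (0.0, 0.0)   # top-left corner of the imaginary screen on the palm
PALM_SIZE = (6.0, 9.0)     # width x height in cm, roughly iPhone screen sized

GRID_COLS, GRID_ROWS = 4, 5  # assumed home-screen icon grid (4x4 plus dock row)

PHONE_ADDR = ("192.168.0.42", 9000)  # hypothetical listener on the iPhone


def palm_to_grid(x_cm: float, y_cm: float) -> tuple[int, int] | None:
    """Map a palm touch (cm, palm space) to a home-screen grid cell."""
    nx = (x_cm - PALM_ORIGIN[0]) / PALM_SIZE[0]
    ny = (y_cm - PALM_ORIGIN[1]) / PALM_SIZE[1]
    if not (0.0 <= nx < 1.0 and 0.0 <= ny < 1.0):
        return None  # touch landed outside the imaginary screen
    return int(nx * GRID_COLS), int(ny * GRID_ROWS)


def send_tap(sock: socket.socket, x_cm: float, y_cm: float) -> None:
    """Forward one tap event to the phone, which resolves it to an app."""
    cell = palm_to_grid(x_cm, y_cm)
    if cell is None:
        return
    event = {"type": "tap", "col": cell[0], "row": cell[1]}
    sock.sendto(json.dumps(event).encode(), PHONE_ADDR)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # e.g. a touch 1.2 cm from the left and 2.0 cm from the top of the palm
    send_tap(sock, 1.2, 2.0)  # maps to grid cell (0, 1)
```

On the phone side, a small listener would resolve the (col, row) cell to the app at that home-screen position and invoke it, mirroring how the prototype forwards palm touches to the real device.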

Supplementary Material

JPG File (fp192.jpg)
MOV File (fp192.mov)

    Published In

    UIST '11: Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology
    October 2011, 654 pages
    ISBN: 9781450307161
    DOI: 10.1145/2047196

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. imaginary interface
    2. memory
    3. mobile
    4. non-visual
    5. screen-less
    6. spatial memory
    7. touch
    8. wearable

    Conference

    UIST '11

    Acceptance Rates

    UIST '11 paper acceptance rate: 67 of 262 submissions (26%)
    Overall acceptance rate: 561 of 2,567 submissions (22%)
