DOI: 10.1145/2807442.2807450

CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring

Published: 05 November 2015

Abstract

This paper presents CyclopsRing, a ring-style fisheye imaging wearable device that is worn on the webbing of the hand to enable whole-hand and context-aware interactions. Observing from a central position on the hand through a fisheye perspective, CyclopsRing sees not only the operating hand but also the environmental contexts that are involved in the hand-based interactions. Because CyclopsRing is a finger-worn device, it also lets users fully preserve the skin feedback of their hands. This paper demonstrates a proof-of-concept device, reports its hand-gesture recognition performance using a random decision forest (RDF) method, and, building on the gesture recognizer, presents a set of interaction techniques including on-finger pinch-and-slide input, in-air pinch-and-motion input, palm-writing input, and their combination with environmental contexts. The experiment obtained an 84.75% recognition rate on a database of seven hand gestures collected from 15 participants. To our knowledge, CyclopsRing is the first ring-worn device that supports whole-hand and context-aware interactions.
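
As a rough illustration of the recognition approach sketched in the abstract, the following Python snippet trains a random-decision-forest classifier on per-frame feature vectors using scikit-learn. This is a minimal sketch under stated assumptions: the feature extraction from the fisheye images, the feature dimensionality, and all names and values in the code (other than the seven-gesture vocabulary) are hypothetical and do not reproduce the authors' implementation.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Illustrative constants (assumptions, not values from the paper,
    # except for the seven-gesture vocabulary).
    N_GESTURES = 7     # seven hand gestures, as in the paper's database
    N_SAMPLES = 1500   # stand-in for labeled frames from participants
    N_FEATURES = 64    # hypothetical per-frame feature dimensionality

    # Synthetic stand-in data; a real pipeline would extract these features
    # from the fisheye camera frames (e.g., hand-silhouette descriptors).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(N_SAMPLES, N_FEATURES))
    y = rng.integers(0, N_GESTURES, size=N_SAMPLES)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Random decision forest classifier (here: scikit-learn's random forest).
    forest = RandomForestClassifier(n_estimators=100, max_depth=12,
                                    random_state=0)
    forest.fit(X_train, y_train)

    # Per-frame predictions; a deployed recognizer would typically smooth
    # these over a short window of frames before emitting a gesture event.
    print("held-out accuracy:", forest.score(X_test, y_test))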

Supplementary Material

MP4 File (p549.mp4)



    Published In

    UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
    November 2015
    686 pages
    ISBN:9781450337793
    DOI:10.1145/2807442
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. finger touch input
    2. hand gesture input
    3. palm touch input
    4. ring
    5. wearable devices
    6. whole-hand interaction

    Qualifiers

    • Research-article

    Conference

    UIST '15

    Acceptance Rates

    UIST '15 Paper Acceptance Rate: 70 of 297 submissions, 24%
    Overall Acceptance Rate: 561 of 2,567 submissions, 22%


    Article Metrics

    • Downloads (Last 12 months): 128
    • Downloads (Last 6 weeks): 21
    Reflects downloads up to 10 Feb 2025

