
TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Published: 08 January 2018

Abstract

On-body interaction, which employs the user's own body as an interactive surface, offers several advantages over existing touchscreen devices: always-available control, an expanded input space, and additional proprioceptive and tactile cues that support non-visual use. While past work has explored a variety of approaches such as wearable depth cameras, bio-acoustics, and infrared reflectance (IR) sensors, these systems do not instrument the gesturing finger, do not easily support multiple body locations, and have not been evaluated with visually impaired users (our target). In this paper, we introduce TouchCam, a finger wearable to support location-specific, on-body interaction. TouchCam combines data from infrared sensors, inertial measurement units, and a small camera to classify body locations and gestures using supervised learning. We empirically evaluate TouchCam's performance through a series of offline experiments followed by a real-time interactive user study with 12 blind and visually impaired participants. In our offline experiments, we achieve high accuracy (>96%) at recognizing coarse-grained touch locations (e.g., palm, fingers) and location-specific gestures (e.g., tap on wrist, left swipe on thigh). The follow-up user study validated our real-time system and helped reveal tradeoffs between various on-body interface designs (e.g., accuracy, convenience, social acceptability). Our findings also highlight challenges to robust input sensing for visually impaired users and suggest directions for the design of future on-body interaction systems.
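The pipeline the abstract describes, concatenating features from several sensors (IR reflectance, IMU motion, camera texture) and feeding them to a supervised classifier that predicts a location-specific gesture, can be sketched as follows. This is a toy illustration only: the feature values, labels, and the nearest-centroid classifier are hypothetical stand-ins, not TouchCam's actual features or model.

```python
# Toy multimodal sensor-fusion classifier. Per-sensor feature vectors are
# concatenated into one sample; a nearest-centroid model (a hypothetical
# stand-in for TouchCam's supervised learner) predicts the gesture label.
from collections import defaultdict
from math import dist

def fuse(ir, imu, cam):
    """Concatenate per-sensor feature vectors into one flat sample."""
    return tuple(ir) + tuple(imu) + tuple(cam)

class NearestCentroid:
    def fit(self, samples, labels):
        # Group training samples by label, then average each feature column.
        buckets = defaultdict(list)
        for x, y in zip(samples, labels):
            buckets[y].append(x)
        self.centroids = {
            y: tuple(sum(col) / len(xs) for col in zip(*xs))
            for y, xs in buckets.items()
        }
        return self

    def predict(self, x):
        # Label of the closest class centroid in Euclidean distance.
        return min(self.centroids, key=lambda y: dist(x, self.centroids[y]))

# Hypothetical training data for two location-specific gestures.
train = [
    (fuse([0.9], [0.1, 0.0], [0.2]), "tap-on-wrist"),
    (fuse([0.8], [0.2, 0.1], [0.3]), "tap-on-wrist"),
    (fuse([0.2], [0.9, 0.8], [0.7]), "swipe-on-thigh"),
    (fuse([0.3], [0.8, 0.9], [0.6]), "swipe-on-thigh"),
]
clf = NearestCentroid().fit([x for x, _ in train], [y for _, y in train])
print(clf.predict(fuse([0.85], [0.15, 0.05], [0.25])))  # tap-on-wrist
```

In the real system each modality contributes a richer descriptor (e.g., image texture features from the camera) and the classifier is trained per user, but the fuse-then-classify structure is the same.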

Supplementary Material

stearns (stearns.zip)
Supplemental movie, appendix, image, and software files for TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments




Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies  Volume 1, Issue 4
December 2017
1298 pages
EISSN:2474-9567
DOI:10.1145/3178157
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 January 2018
Accepted: 01 October 2017
Revised: 01 August 2017
Received: 01 May 2017
Published in IMWUT Volume 1, Issue 4


Author Tags

  1. Accessibility
  2. Blind and Low-Vision Users
  3. Computer Vision Applications
  4. Gesture Recognition
  5. On-body input
  6. Skin Texture Classification
  7. Wearable sensors

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Assistant Secretary of Defense for Health Affairs


Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months)47
  • Downloads (Last 6 weeks)9
Reflects downloads up to 03 Mar 2025


Citations

Cited By

  • (2024) SonicVista: Towards Creating Awareness of Distant Scenes through Sonification. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1--32. https://doi.org/10.1145/3659609. Online publication date: 15-May-2024
  • (2024) EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1--11. https://doi.org/10.1145/3654777.3676455. Online publication date: 13-Oct-2024
  • (2024) IRIS: Wireless ring for vision-based smart home interaction. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1--16. https://doi.org/10.1145/3654777.3676327. Online publication date: 13-Oct-2024
  • (2023) MicroCam. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1--28. https://doi.org/10.1145/3610921. Online publication date: 27-Sep-2023
  • (2023) OptiRing: Low-Resolution Optical Sensing for Subtle Thumb-to-Index Micro-Interactions. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1--13. https://doi.org/10.1145/3607822.3614538. Online publication date: 13-Oct-2023
  • (2023) “I Could Wear It All of the Time, Just Like My Wedding Ring:” Insights into Older People’s Perceptions of Smart Rings. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1--8. https://doi.org/10.1145/3544549.3585771. Online publication date: 19-Apr-2023
  • (2023) New Insights into User-Defined Smart Ring Gestures with Implications for Gesture Elicitation Studies. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1--8. https://doi.org/10.1145/3544549.3585590. Online publication date: 19-Apr-2023
  • (2023) Understanding Wheelchair Users’ Preferences for On-Body, In-Air, and On-Wheelchair Gestures. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1--16. https://doi.org/10.1145/3544548.3580929. Online publication date: 19-Apr-2023
  • (2023) iFAD Gestures: Understanding Users’ Gesture Input Performance with Index-Finger Augmentation Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1--17. https://doi.org/10.1145/3544548.3580928. Online publication date: 19-Apr-2023
  • (2023) PianoWatch: An Intelligent Piano Understanding and Evaluation System Using Smartwatch. 2023 20th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), 222--230. https://doi.org/10.1109/SECON58729.2023.10287516. Online publication date: 11-Sep-2023
