Detection of Finger Contact with Skin Based on Shadows and Texture Around Fingertips

  • Conference paper
  • First Online:
Human-Computer Interaction. Interaction Techniques and Novel Applications (HCII 2021)

Abstract

This paper proposes a method to detect contact between fingers and skin based on shadows and texture around the fingertips. Using the proposed method, an RGB camera installed on a head-mounted display can detect finger contact with the body. The processing pipeline consists of fingertip image extraction, image enhancement, and contact detection using machine learning. A fingertip image is cropped from a hand image to limit image features to those around the fingertip. Image enhancement reduces the influence of differing lighting environments. Contact detection uses deep learning models to achieve high accuracy. Datasets of fingertip images are built from videos recording a user touching and releasing the forearm with his/her fingers. An experiment evaluates the proposed method with respect to image enhancement methods and data augmentation methods. The results show that the proposed method reaches a maximum accuracy of 97.6% in cross-validation, and that it is more robust to different users than to different lighting environments.
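The abstract does not include code, but the pipeline it describes (crop a patch around a detected fingertip, normalize illumination, classify contact) can be sketched roughly. In this sketch the 64-pixel crop size, the use of global histogram equalization as the enhancement step, and the caller-supplied fingertip coordinate are all assumptions for illustration, not the authors' actual configuration; the classifier stage is omitted.

```python
import numpy as np


def crop_fingertip(frame: np.ndarray, tip_xy: tuple, size: int = 64) -> np.ndarray:
    """Extract a size x size patch centred on a fingertip coordinate.

    In a real system tip_xy would come from a hand tracker (e.g. a
    MediaPipe Hands fingertip landmark); here it is supplied by the caller.
    """
    x, y = tip_xy
    h, w = frame.shape[:2]
    half = size // 2
    # Clamp so the patch stays fully inside the frame near its borders.
    x0 = min(max(x - half, 0), w - size)
    y0 = min(max(y - half, 0), h - size)
    return frame[y0:y0 + size, x0:x0 + size]


def equalize(patch: np.ndarray) -> np.ndarray:
    """Global histogram equalization of a grayscale uint8 patch.

    A stand-in for the paper's image-enhancement step, whose purpose is
    to reduce the influence of the lighting environment.
    """
    hist = np.bincount(patch.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # first occupied intensity bin
    lut = np.clip(
        np.round((cdf - cdf_min) * 255.0 / max(cdf[-1] - cdf_min, 1)),
        0, 255,
    ).astype(np.uint8)
    return lut[patch]


# Toy usage: a synthetic 480x640 grayscale frame and a fingertip near a corner.
frame = np.random.default_rng(0).integers(0, 256, (480, 640)).astype(np.uint8)
patch = equalize(crop_fingertip(frame, (620, 10)))
print(patch.shape)  # (64, 64)
```

The equalized patch would then be fed to a small CNN trained on the touch/release dataset; that binary classifier is the deep-learning stage the abstract refers to.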



Author information

Correspondence to Yuto Sekiya.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Sekiya, Y., Umezawa, T., Osawa, N. (2021). Detection of Finger Contact with Skin Based on Shadows and Texture Around Fingertips. In: Kurosu, M. (ed.) Human-Computer Interaction. Interaction Techniques and Novel Applications. HCII 2021. Lecture Notes in Computer Science, vol. 12763. Springer, Cham. https://doi.org/10.1007/978-3-030-78465-2_9

  • DOI: https://doi.org/10.1007/978-3-030-78465-2_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-78464-5

  • Online ISBN: 978-3-030-78465-2

  • eBook Packages: Computer Science (R0)
