
Deep Learning Based Gesture Classification for Hand Physical Therapy Interactive Program

  • Conference paper
  • First Online:

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12198)

Abstract

In this paper, we propose using Google Colab to create and train a convolutional neural network from scratch. The trained network serves as the core artificial intelligence feature of our interactive software game, which aims to encourage white-collar workers to exercise their hands and wrists frequently by playing the game. Currently, the network is trained on our self-collected dataset of 12,000 bare-hand gesture images shot against a static dark background. The network classifies a still image into one of six predefined gesture classes and appears to cope well with slight variations in hand size, skin tone, position, and orientation. It is designed to be computationally lightweight and runs in real time even on a CPU. The network yields 99.68% accuracy on the validation set and 78% average accuracy when tested with 50 different users. Our experiment with actual users reveals useful insights into the problems of using a deep learning based classifier in a real-time interactive system.
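
As a rough illustration of the kind of lightweight classifier the abstract describes, the sketch below builds a small convolutional network with TensorFlow/Keras, which runs readily in Google Colab. The input resolution, layer sizes, training settings, and dataset paths (gestures/train, gestures/val) are illustrative assumptions only and do not reproduce the architecture or preprocessing reported in the paper.

# Minimal sketch of a six-class gesture CNN, assuming TensorFlow 2.x / Keras in Google Colab.
# Layer sizes, input resolution, and paths are illustrative assumptions, not the paper's design.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 6          # six predefined gesture classes (from the abstract)
IMG_SIZE = (128, 128)    # assumed input resolution, chosen for illustration

def build_gesture_cnn():
    """Small CNN kept shallow so inference stays fast on a CPU (assumed design)."""
    model = models.Sequential([
        layers.Input(shape=IMG_SIZE + (3,)),
        layers.Rescaling(1.0 / 255),               # scale pixel values to [0, 1]
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Hypothetical directory layout: one subfolder per gesture class,
    # filled with bare-hand images shot against a dark background.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "gestures/train", image_size=IMG_SIZE, batch_size=32)
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "gestures/val", image_size=IMG_SIZE, batch_size=32)
    model = build_gesture_cnn()
    model.fit(train_ds, validation_data=val_ds, epochs=10)

Keeping the convolutional stack shallow and the dense head small is one common way to keep per-frame inference fast enough for CPU-only, real-time use; the gap between the 99.68% validation accuracy and the 78% accuracy with new users quoted above is a reminder that such a model should still be evaluated on people outside the training set.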



Author information

Corresponding author

Correspondence to Thitirat Siriborvornratanakul.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Rungruanganukul, M., Siriborvornratanakul, T. (2020). Deep Learning Based Gesture Classification for Hand Physical Therapy Interactive Program. In: Duffy, V. (ed.) Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. Posture, Motion and Health. HCII 2020. Lecture Notes in Computer Science, vol 12198. Springer, Cham. https://doi.org/10.1007/978-3-030-49904-4_26


  • DOI: https://doi.org/10.1007/978-3-030-49904-4_26

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49903-7

  • Online ISBN: 978-3-030-49904-4

  • eBook Packages: Computer Science, Computer Science (R0)
