Dynamic Hand Gesture Recognition for Numeral Handwritten via A-Mode Ultrasound

  • Conference paper
Intelligent Robotics and Applications (ICIRA 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13456)

Abstract

In recent years, because the sEMG signal is weak, insensitive to fine finger movements, and easily corrupted by noise, researchers have turned to A-mode ultrasound (AUS) for gesture decoding. However, current AUS gesture recognition algorithms remain relatively basic and can only classify discrete gestures; because individual AUS frames carry no temporal information, there is still no algorithm that recognizes a dynamic gesture as a continuous process. We therefore design and evaluate a deep learning model for AUS signals based on an LSTM framework. The model treats a fixed number of consecutive frames as one complete action and links the frames within this window, thereby constructing the temporal correlation (time characteristic) of the AUS signal. The features extracted from the AUS sequence are then passed to a fully connected layer that outputs the classification result. Since no dynamic-gesture dataset exists for AUS signals, we designed and collected handwritten digits 0–9 as an example of dynamic gestures. Experimental results show that the algorithm achieves dynamic gesture classification from AUS signals and overcomes the lack of temporal information in single AUS frames. Moreover, compared with the actions used in traditional experiments, handwriting digits gives dynamic gestures a practical meaning that is closer to everyday life.
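
The abstract only outlines the model, so the snippet below is a minimal illustrative sketch of such an LSTM-plus-fully-connected pipeline: a fixed window of per-frame AUS feature vectors is passed through an LSTM to build the temporal correlation, and the final hidden state is mapped to the ten digit classes. The window length, feature dimension, hidden size, and all other hyperparameters here are assumptions made for illustration, not values reported in the paper.

```python
# Minimal sketch of an LSTM-plus-fully-connected classifier of the kind
# described in the abstract. All sizes below (30 frames per gesture, a
# 64-dimensional feature vector per AUS frame, a 128-unit hidden state)
# are illustrative assumptions, not values taken from the paper.
import torch
import torch.nn as nn

class AUSGestureLSTM(nn.Module):
    def __init__(self, feat_dim=64, hidden_dim=128, num_classes=10):
        super().__init__()
        # The LSTM links consecutive AUS frames so that the temporal
        # structure of the whole handwriting motion is modelled.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        # A fully connected layer maps the last hidden state to the
        # ten digit classes (0-9).
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, num_frames, feat_dim), one feature vector per AUS frame
        _, (h_n, _) = self.lstm(x)
        return self.fc(h_n[-1])  # logits of shape (batch, num_classes)

# Example usage with dummy data: a batch of 8 gestures, 30 frames each.
model = AUSGestureLSTM()
logits = model(torch.randn(8, 30, 64))  # -> torch.Size([8, 10])
```

Training such a model with a cross-entropy loss over the ten digit classes, one label per recorded handwriting sequence, would be the natural choice.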

Author information

Corresponding author

Correspondence to Honghai Liu.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Liu, D., Zhang, D., Liu, H. (2022). Dynamic Hand Gesture Recognition for Numeral Handwritten via A-Mode Ultrasound. In: Liu, H., et al. (eds.) Intelligent Robotics and Applications. ICIRA 2022. Lecture Notes in Computer Science, vol. 13456. Springer, Cham. https://doi.org/10.1007/978-3-031-13822-5_55

  • DOI: https://doi.org/10.1007/978-3-031-13822-5_55

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-13821-8

  • Online ISBN: 978-3-031-13822-5

  • eBook Packages: Computer Science, Computer Science (R0)
