Abstract
Sign language is a form of communication used by the deaf population to interact with the hearing population, employing the gesture-spatial configuration of the hands as the communication channel with their social environment. This work proposes a sign language gesture recognition method based on processing the time series of spatial positions of hand reference points provided by a Leap Motion optical sensor. The methodology is applied to a validated American Sign Language (ASL) dataset and comprises the following stages: (i) preprocessing to filter out null frames, (ii) segmentation of the relevant information, (iii) time-frequency characterization using the Discrete Wavelet Transform (DWT), and (iv) classification with machine learning algorithms. The proposed methodology achieves a classification accuracy of 97.96% with the Fast Tree algorithm.
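As a rough illustration of the pipeline described above (not the authors' implementation), the sketch below drops null frames, summarizes each hand-keypoint coordinate series with multilevel DWT subband statistics, and trains a boosted-tree classifier. It assumes the PyWavelets and scikit-learn libraries, assumes the gestures are already segmented into fixed windows, and uses scikit-learn's GradientBoostingClassifier as a stand-in for the Fast Tree (boosted decision tree) algorithm; helper names such as extract_dwt_features are hypothetical.

```python
# Hypothetical sketch of the described pipeline: (i) null-frame filtering,
# (iii) DWT time-frequency characterization, (iv) classification.
# Step (ii), segmentation of the relevant gesture window, is assumed done.
import numpy as np
import pywt
from sklearn.ensemble import GradientBoostingClassifier  # stand-in for Fast Tree
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_dwt_features(series, wavelet="db4", level=3):
    """Summarize one keypoint coordinate time series with DWT subband statistics."""
    series = series[~np.isnan(series)]                   # (i) discard null frames
    coeffs = pywt.wavedec(series, wavelet, level=level)  # (iii) multilevel DWT
    feats = []
    for c in coeffs:                                     # per-subband descriptors
        feats += [c.mean(), c.std(), np.abs(c).max(), np.sum(c ** 2)]
    return np.array(feats)

def gesture_to_vector(gesture):
    """gesture: array of shape (n_frames, n_coordinates), e.g. fingertip x/y/z."""
    return np.concatenate([extract_dwt_features(gesture[:, j])
                           for j in range(gesture.shape[1])])

def train_classifier(gestures, labels):
    """gestures: list of per-gesture arrays; labels: corresponding sign labels."""
    X = np.vstack([gesture_to_vector(g) for g in gestures])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2,
                                              stratify=labels, random_state=0)
    clf = GradientBoostingClassifier().fit(X_tr, y_tr)   # (iv) classification
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    return clf
```

Summarizing each DWT subband with a few statistics (mean, standard deviation, peak magnitude, energy) is a common way to map variable-length Leap Motion time series onto fixed-length feature vectors suitable for conventional classifiers.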
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
López-Albán, D., López-Barrera, A., Mayorca-Torres, D., Peluffo-Ordóñez, D. (2021). Sign Language Recognition Using Leap Motion Based on Time-Frequency Characterization and Conventional Machine Learning Techniques. In: Florez, H., Pollo-Cattaneo, M.F. (eds) Applied Informatics. ICAI 2021. Communications in Computer and Information Science, vol 1455. Springer, Cham. https://doi.org/10.1007/978-3-030-89654-6_5
DOI: https://doi.org/10.1007/978-3-030-89654-6_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-89653-9
Online ISBN: 978-3-030-89654-6
eBook Packages: Computer Science, Computer Science (R0)