Abstract
Sign language is a means of communication for individuals who are deaf or hard of hearing, as well as for people who do not speak the language of the host country. Although it is a means of communication, few people know it, and it has no universal standard: each country has its own set of signs, shaped by its customs and traditions. Currently, much research applies computer vision and artificial intelligence to the problem of sign language translation.
This topic is important because deep Convolutional Neural Networks (CNNs) can recognize hand gestures in real time and thereby translate sign language.
This article presents two datasets: ArSL-2018 (Arabic Sign Language), containing 54,049 alphabet images taken by more than 40 people, and our own dataset of 15,200 images. Both are divided into 32 classes of standard Arabic characters and were classified, normalized, and detected using VGGNet and ResNet50 models. The study compares the results of training and testing performed without fine-tuning and with fine-tuning.
The optimized weights of each VGGNet layer were obtained by pre-training the network on the two large alphabet sign language datasets. In addition, the ResNet50 classifier was trained for 40 epochs and then fine-tuned for a further 40 epochs to ensure optimal classification performance. The high accuracy achieved in comparison with other studies supports the effectiveness of this approach: on the ArSL alphabet dataset, the VGG16, VGG19, and ResNet50 models reached accuracies of 99.05%, 99.99%, and 98.50%, respectively, demonstrating the effectiveness of the proposed method for hand gesture recognition tasks.
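The two-phase schedule described above (train a classification head on a frozen pre-trained backbone, then unfreeze and fine-tune) can be sketched as follows. This is a minimal illustration, not the authors' code: the input size, optimizer, learning rates, and the commented-out dataset variables (`train_ds`, `val_ds`) are assumptions; only the 32-class output and the 40 + 40 epoch split come from the abstract.

```python
# Hypothetical sketch of two-phase transfer learning with ResNet50 in Keras.
# Image size, optimizer, and learning rates are assumptions; the 32 classes
# and the 40-epoch / 40-epoch schedule follow the abstract.
from tensorflow import keras

NUM_CLASSES = 32  # standard Arabic alphabet characters

# weights=None here only to keep the sketch self-contained;
# in practice one would use weights="imagenet" for transfer learning.
base = keras.applications.ResNet50(
    include_top=False, weights=None, input_shape=(64, 64, 3), pooling="avg")
base.trainable = False  # phase 1: train only the new classification head

model = keras.Sequential([
    base,
    keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=40)

# phase 2: unfreeze the backbone and continue at a lower learning rate
base.trainable = True
model.compile(optimizer=keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=40)
```

Recompiling after toggling `trainable` is required in Keras for the change to take effect; the much smaller learning rate in phase 2 prevents the large initial gradients from destroying the pre-trained features.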
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Herbaz, N., El Idrissi, H., Badri, A. (2024). A Novel Approach for Recognition and Classification of Hand Gesture Using Deep Convolution Neural Networks. In: Bennour, A., Bouridane, A., Chaari, L. (eds) Intelligent Systems and Pattern Recognition. ISPR 2023. Communications in Computer and Information Science, vol 1940. Springer, Cham. https://doi.org/10.1007/978-3-031-46335-8_8
DOI: https://doi.org/10.1007/978-3-031-46335-8_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-46334-1
Online ISBN: 978-3-031-46335-8
eBook Packages: Computer Science (R0)