Abstract
A reading assistant system for blind people based on hand gesture recognition is proposed in this paper. The system consists of seven modules: a camera input module, a page adjustment module, a page information retrieval module, a hand pose estimation module, a hand gesture recognition module, a media controller, and an audio output device. In the page adjustment module, Hough line detection and local OCR (Optical Character Recognition) are used to rectify text orientation. For the hand gesture recognition module, we propose three practical methods: a geometry model, a heatmap model, and a keypoint model. The geometry model distinguishes gestures by the geometrical characteristics of the hand. The heatmap model, based on an image classification algorithm, uses a CNN (Convolutional Neural Network) to classify hand gestures. To simplify the networks of the heatmap model, the keypoint model extracts 21 keypoints from a hand heatmap and uses their coordinates as a dataset for training a classifier. All three methods achieve good gesture recognition results, and by recognizing gestures the designed system provides an effective reading assistant function.
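The geometry model described above can be illustrated with a minimal sketch. The paper does not publish its exact decision rules, so everything below is an assumption: we take the standard 21-keypoint hand layout used by OpenPose (wrist at index 0, then four joints per finger), and use a hypothetical rule that a finger is "extended" when its tip lies farther from the wrist than its middle joint does.

```python
import math

# Assumed 21-keypoint hand layout (OpenPose convention): 0 = wrist, then
# four joints per finger -- thumb 1-4, index 5-8, middle 9-12, ring 13-16,
# pinky 17-20. The last index of each pair below is the fingertip.
FINGERS = {
    "thumb": (2, 4),    # (middle joint, fingertip)
    "index": (6, 8),
    "middle": (10, 12),
    "ring": (14, 16),
    "pinky": (18, 20),
}

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def extended_fingers(keypoints):
    """Return the set of finger names judged 'extended'.

    Hypothetical geometric rule standing in for the paper's geometrical
    characteristics: a finger counts as extended when its tip is farther
    from the wrist than its middle joint is.
    """
    wrist = keypoints[0]
    return {
        name
        for name, (joint, tip) in FINGERS.items()
        if dist(keypoints[tip], wrist) > dist(keypoints[joint], wrist)
    }

# A gesture label could then be looked up from the extended-finger set,
# e.g. {"index"} -> "point", set of all five -> "open palm".
```

The keypoint model in the paper replaces such hand-crafted rules with a classifier trained directly on the 42 coordinate values; the sketch only shows the kind of geometric feature the geometry model relies on.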
Supported in part by the Shanghai Municipal Commission of Science and Technology (17JC1402900), and in part by the National Natural Science Foundation of China under Grant 61831015.
© 2020 Springer Nature Singapore Pte Ltd.
Lu, Q., Zhai, G., Min, X., Zhu, Y. (2020). A Reading Assistant System for Blind People Based on Hand Gesture Recognition. In: Zhai, G., Zhou, J., Yang, H., An, P., Yang, X. (eds) Digital TV and Wireless Multimedia Communication. IFTC 2019. Communications in Computer and Information Science, vol 1181. Springer, Singapore. https://doi.org/10.1007/978-981-15-3341-9_17