
Automatic recognition of the American sign language fingerspelling alphabet to assist people living with speech or hearing impairments

  • Original Research
  • Journal of Ambient Intelligence and Humanized Computing

Abstract

Sign languages are natural languages used mostly by deaf and hard-of-hearing people. Development opportunities for people with these disabilities are often limited by communication barriers. Advances in sign and gesture recognition technology will make computer-supported interpretation of sign languages possible. There are more than 137 sign languages around the world; a system that interprets them could therefore benefit many people, especially the Deaf community. This paper presents a sign recognition system based on hand-tracking devices (Leap Motion and Intel RealSense) that uses a Support Vector Machine (SVM) for sign classification. The system was evaluated with over 50 individuals and achieved remarkable recognition accuracy on selected signs, reaching 100% for some of them. Furthermore, we explore the potential of the Leap Motion and the Intel RealSense as hand-tracking devices for recognizing the American Sign Language fingerspelling alphabet.
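As a minimal sketch of the classification step the abstract describes, the snippet below trains an SVM on hand-tracking feature vectors with scikit-learn (whose SVC class wraps LIBSVM). The 15-value feature layout (x, y, z for five fingertips), the five-letter label subset, and the random stand-in data are illustrative assumptions, not the authors' actual feature set or pipeline.

```python
# Hypothetical sketch: classifying ASL fingerspelling signs from
# hand-tracking features with an SVM. Feature layout and labels are
# illustrative assumptions, not the paper's actual pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in data: 200 samples of 15 features each (5 fingertips x 3 axes),
# labeled with a small subset of the fingerspelling alphabet.
X = rng.normal(size=(200, 15))
y = rng.choice(list("ABCDE"), size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# RBF-kernel SVM; on real fingertip coordinates the features would be
# normalized relative to the palm position before training.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```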





Acknowledgements

This work was partially supported by the Escuela de Ciencias de la Computación e Informática at Universidad de Costa Rica (ECCI-UCR) grant No. 320-B5-291, by Centro de Investigaciones en Tecnologías de la Información y Comunicación de la Universidad de Costa Rica (CITIC-UCR), and by Ministerio de Ciencia, Tecnología y Telecomunicaciones (MICITT) and Consejo Nacional para Investigaciones Científicas y Tecnológicas (CONICIT) of the Government of Costa Rica.

Author information

Correspondence to Luis Quesada.


About this article


Cite this article

Quesada, L., López, G. & Guerrero, L. Automatic recognition of the American sign language fingerspelling alphabet to assist people living with speech or hearing impairments. J Ambient Intell Human Comput 8, 625–635 (2017). https://doi.org/10.1007/s12652-017-0475-7

