
Multi-touch gesture recognition of Braille input based on Petri Net and RBF Net

  • 1182: Deep Processing of Multimedia Data
  • Published in: Multimedia Tools and Applications

Abstract

The development of information accessibility is receiving more and more attention. One challenging task for blind users is to input Braille on touch screens, since they have no way to sense location information on the screen. Existing Braille input methods suffer from inaccurate positioning and a lack of interactive prompts. In this paper, touch gestures are recognized by a trained RBF network, while combined gestures are modeled by a Petri net that describes their logical, temporal, and spatial relationships. On this basis, Braille input based on multi-touch gesture recognition is implemented. The experimental results show that the method is effective and that blind users can input Braille comfortably with near real-time interaction. The input method makes full use of the inherent logic of Braille, making it easy for blind users to learn and remember, and provides a new method of human-computer interaction between blind users and the touch screen.
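As a hedged illustration of the gesture-recognition half of the abstract (a minimal sketch, not the authors' implementation): an RBF network maps a feature vector extracted from a touch trajectory through Gaussian units centered on prototype gestures, and a linear output layer picks the class. The data, centers, and `gamma` value below are synthetic placeholders.

```python
import numpy as np

def rbf_features(X, centers, gamma=1.0):
    """Gaussian activation of each sample against each RBF center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def train_rbf(X, y, centers, gamma=1.0):
    """Fit the linear output layer by least squares on one-hot targets."""
    Phi = rbf_features(X, centers, gamma)
    T = np.eye(y.max() + 1)[y]             # one-hot class labels
    W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
    return W

def predict(X, centers, W, gamma=1.0):
    return rbf_features(X, centers, gamma).dot(W).argmax(axis=1)

# Toy data: two gesture classes, well separated in feature space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(1, 0.1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
centers = np.array([[0.0, 0.0], [1.0, 1.0]])   # e.g. from clustering
W = train_rbf(X, y, centers, gamma=5.0)
acc = (predict(X, centers, W, gamma=5.0) == y).mean()
```

In practice the centers would come from clustering recorded gesture samples and the features from resampled trajectory points, but those details are not given in this page.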

Acknowledgements

This work was supported by the Major Programs of the Natural Science Foundation of the Jiangsu Higher Education Institutions of China (No. 19KJA310002), the Natural Science Foundation of the Jiangsu Higher Education Institutions of China (No. 17KJD520006), and the Programs of Sign Language & Braille of the China Disabled Persons’ Federation (No. CLM2020-01).

Sections 1.1, 2.2.1 and 3.2 have been published at the 2nd EAI International Conference on Multimedia Technology and Enhanced Learning (J. X. Zhang, X. Q. Zeng, and ZH. S. Zhu. Multi-Touch Gesture Recognition of Braille Input Based on RBF Net. EAI ICMTEL 2020, Leicester, Great Britain, 2020-04-10).

This paper adds the following content:

(1) It uses a Petri net to describe the temporal relationships among multiple touches and an RBF network to recognize multi-touch trajectories, which improves the accuracy of touch localization and the efficiency of human-computer interaction between blind users and the touch screen.

(2) It adds pertinent experimental data (Sect. 4) and all tables.
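The Petri-net half of point (1) can be sketched as follows. This is a minimal illustration of the idea, not the paper's net: the place names, the token rule, and the 0.3 s time window are assumptions for the example. A transition fires only when all of its input places hold a token and the corresponding touch timestamps fall inside the window, which is how a Petri net can encode the temporal relation among touches in a combined gesture.

```python
class Transition:
    """One Petri-net transition with a temporal firing constraint."""

    def __init__(self, inputs, output, window=0.3):
        self.inputs, self.output, self.window = inputs, output, window

    def try_fire(self, marking, timestamps):
        """Fire if every input place is marked and the touch timestamps
        span no more than the allowed window (temporal relation)."""
        if all(marking.get(p, 0) > 0 for p in self.inputs):
            ts = [timestamps[p] for p in self.inputs]
            if max(ts) - min(ts) <= self.window:
                for p in self.inputs:
                    marking[p] -= 1          # consume input tokens
                marking[self.output] = marking.get(self.output, 0) + 1
                return True
        return False

# Two near-simultaneous touches combine into a "two-finger tap" gesture.
marking = {"touch_A": 1, "touch_B": 1}
stamps = {"touch_A": 0.00, "touch_B": 0.05}
t = Transition(["touch_A", "touch_B"], "two_finger_tap")
fired = t.try_fire(marking, stamps)
```

A larger net would chain such transitions so that recognized single-touch trajectories (the RBF network's output) deposit tokens, and downstream transitions assemble them into Braille dot patterns.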

Author information

Corresponding author

Correspondence to Juxiao Zhang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zhang, J., Zeng, X. Multi-touch gesture recognition of Braille input based on Petri Net and RBF Net. Multimed Tools Appl 81, 19395–19413 (2022). https://doi.org/10.1007/s11042-021-11156-9
