VectorEntry: Text Entry Mechanism Using Handheld Touch-Enabled Mobile Devices for People with Visual Impairments

Published: 3 August 2020

Abstract

Mobile phones are now touch-enabled, which allows text to be entered through on-screen keyboards. Text entry is among the most frequent tasks performed by mobile phone users. However, people with visual impairments find on-screen keyboards difficult to use, and this limits their digital literacy. This article proposes a text entry mechanism based on directional movement gestures suited to people with visual impairments. Two forms of directional movement gesture, guided and unguided, are first studied, and the analysis reveals that unguided directional movement gestures are better suited to a text entry mechanism for individuals who are visually impaired. Based on this insight, a text entry mechanism called VectorEntry is developed. It uses eight unguided directional movement gestures to select characters on the keyboard of a touch-enabled mobile phone; the keyboard follows the traditional 4×3 telephone keypad layout. Experimental results show that VectorEntry achieved an average text entry rate of 3.3 wpm, 83.3% higher than that of No-Look Notes, a state-of-the-art method used for similar tasks, with an average per-character error rate of only 0.19%.
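At the core of such a mechanism is classifying an unguided swipe into one of eight directions. The following is a minimal Python sketch of how a classifier like this could work; the compass labels and 45-degree sectors are illustrative assumptions, not the published VectorEntry direction-to-character mapping.

```python
import math

# Illustrative labels for the eight directions, ordered counterclockwise
# from "swipe right". These names are assumptions for this sketch.
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def classify_swipe(x0, y0, x1, y1):
    """Classify a swipe from (x0, y0) to (x1, y1) into one of eight directions.

    Screen coordinates grow downward, so the y delta is negated to get
    a conventional mathematical angle (0 degrees = right, 90 = up).
    """
    angle = math.degrees(math.atan2(-(y1 - y0), x1 - x0)) % 360
    # Each direction owns a 45-degree sector centered on its compass angle,
    # so shift by half a sector before dividing.
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]
```

For example, a swipe from (0, 0) to (10, 0) is classified as "E", and a swipe straight up (to (0, -10) in screen coordinates) as "N". A real implementation would also impose a minimum swipe length before classifying, so that taps and jitter are not misread as gestures.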



Published in ACM Transactions on Accessible Computing, Volume 13, Issue 3 (September 2020), 152 pages.
ISSN: 1936-7228; EISSN: 1936-7236
DOI: 10.1145/3415159

        Copyright © 2020 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 October 2018
• Revised: 1 June 2020
• Accepted: 1 June 2020
• Published: 3 August 2020


Qualifiers: refereed research article
