Abstract
Mobile phones are now touch-enabled, which makes on-screen keyboards the primary means of text entry. Text entry is among the most frequent tasks performed by mobile phone users, yet people with visual impairments find on-screen keyboards difficult to use, which limits their digital literacy. This article proposes a text entry mechanism based on directional movement gestures suited to people with visual impairments. Two forms of directional movement gestures, guided and unguided, are first studied; the analysis reveals that unguided directional movement gestures are better suited to text entry by individuals who are visually impaired. Based on this insight, a text entry mechanism called VectorEntry is developed. It uses eight unguided directional movement gestures to select characters on the keyboard of a touch-enabled mobile phone, with the keyboard laid out as a traditional 4×3 telephone keypad. Experimental results show that VectorEntry achieved an average text entry rate of 3.3 wpm, 83.3% higher than the state-of-the-art No-Look Notes on similar tasks, with an average error rate of only 0.19% per character.
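The abstract describes selecting characters with eight unguided directional movement gestures. A minimal sketch of the underlying classification step, mapping a free-form swipe to one of eight compass directions, might look as follows. The direction names, the 45-degree sector scheme, and the function name are illustrative assumptions, not the paper's published algorithm.

```python
import math

# Illustrative sketch only: compass labels and 45-degree sectors are
# assumptions, not VectorEntry's actual published gesture set.
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def classify_gesture(x0, y0, x1, y1):
    """Map a swipe from (x0, y0) to (x1, y1) to one of eight directions.

    Screen coordinates grow downward, so the y delta is negated to obtain
    conventional mathematical angles (N = up on the screen).
    """
    angle = math.atan2(-(y1 - y0), x1 - x0)  # radians in (-pi, pi]
    # Shift by half a sector (22.5 degrees) so each direction owns a
    # full 45-degree wedge centered on its axis, then index the wedge.
    sector = int(((angle + math.pi / 8) % (2 * math.pi)) // (math.pi / 4))
    return DIRECTIONS[sector]
```

In an eyes-free setting, only the start and end points of the finger's movement matter, so no on-screen target needs to be hit precisely; this is the property that makes unguided gestures attractive for users with visual impairments.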
- Mrim Alnfiai and Srinivas Sampalli. 2017. BrailleEnter: A touch screen Braille text entry method for the blind. Procedia Computer Science 109 (2017), 257--264.
- Android. 2014. Android Operating System. Retrieved November 10, 2018 from http://www.google.com/mobile/search/.
- R. Aoki, R. Hashimoto, A. Miyata, S. Seko, M. Watanabe, and M. Ihara. 2014. Move&Flick: Design and evaluation of a single-finger and eyes-free Kana-character entry method on touch screens. In Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, 311--317.
- Apple. 2018. Apple’s VoiceOver. Retrieved April 15, 2019 from http://www.apple.com/accessibility/iphone/vision.html.
- Shiri Azenkot and Nicole B. Lee. 2013. Exploring the use of speech input by blind people on mobile devices. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, Article No. 11.
- Shiri Azenkot, Jacob O. Wobbrock, Sanjana Prasain, and Richard E. Ladner. 2012. Input finger detection for nonvisual touch screen text entry in Perkinput. In Graphics Interface Conference 2012. Canadian Information Processing Society, 121--129.
- João Benedito, Tiago Guerreiro, Hugo Nicolau, and Daniel Gonçalves. 2010. The key role of touch in non-visual mobile interaction. In Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services. ACM, 379--380.
- Syed Masum Billah, Yu-Jung Ko, Vikas Ashok, Xiaojun Bi, and I. V. Ramakrishnan. 2019. Accessible gesture typing for non-visual text entry on smartphones. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI’19). ACM, Paper No. 376.
- Conrad H. Blickenstorfer. 1995. Graffiti: Wow! Pen Computing Magazine 1 (1995), 30--31.
- Matthew N. Bonner, Jeremy T. Brudvik, Gregory D. Abowd, and W. Keith Edwards. 2010. No-Look Notes: Accessible eyes-free multi-touch text entry. In International Conference on Pervasive Computing. Springer, 409--426.
- L. Braille. 1829. Procedure for writing words, music and plain song using dots for the use of the blind and made available to them. Royal Institution of Blind Youth, Paris (1829).
- Braille Literacy. 2019. National Braille Press. Retrieved November 21, 2019 from https://www.nbp.org/ic/nbp/about/aboutbraille/needforbraille.html.
- Steven J. Castellucci and I. Scott MacKenzie. 2008. Graffiti vs. Unistrokes: An empirical comparison. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 305--308.
- Tuhin Chakraborty and Debasis Samanta. 2013. Exploring an effective interaction mode for blind mobile users in India. In Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction. ACM, 371--378.
- T. D. Cook and D. T. Campbell. 1979. Quasi-Experimentation: Design and Analysis Issues for Field Settings. Houghton Mifflin Company, Boston, MA.
- Rafael Jeferson Pezzuto Damaceno, Juliana Cristina Braga, and Jesús Pascual Mena Chalco. 2016. Mobile device accessibility for the visually impaired: Problems mapping and empirical study of touch screen gestures. In Proceedings of the 15th Brazilian Symposium on Human Factors in Computing Systems (IHC’16). ACM, Article No. 2.
- Jim Denham. 2004. An accessible, pricey answer: A review of the mobile phone organizer. AFB AccessWorld Magazine. Retrieved from http://www.afb.org/afbpress/pub.asp?DocID=aw050305.
- Carlos Duarte, Simon Desart, David Costa, and Bruno Dumas. 2017. Designing multimodal mobile interaction for a text messaging application for visually impaired users. Frontiers in ICT 4 (2017), 26.
- Kevin J. Flannelly, Laura T. Flannelly, and Katherine R. B. Jankowski. 2018. Threats to the internal validity of experimental and quasi-experimental research in healthcare. Health Care Chaplaincy 24 (2018), 107--130.
- Brian Frey, Caleb Southern, and Mario Romero. 2011. BrailleTouch: Mobile texting for the visually impaired. In International Conference on Universal Access in Human-Computer Interaction. Springer, 19--25.
- Gartner. 2015. Habits with smartphone in 2014. Retrieved May 15, 2018 from http://www.gartner.com/newsroom/id/2996817.
- David Goldberg and Cate Richardson. 1993. Touch-typing with a stylus. In Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems. ACM, 80--87.
- Google. 2014. Google’s VoiceActions. Retrieved May 16, 2018 from http://code.google.com/p/eyes-free.
- William Grussenmeyer and Eelke Folmer. 2017. Accessible touchscreen technology for people with visual impairments: A survey. ACM Transactions on Accessible Computing (TACCESS) 9, 2 (2017), 6.
- Tiago Guerreiro, Hugo Nicolau, Joaquim Jorge, and Daniel Gonçalves. 2012. Mobile text-entry: The unattainable ultimate method. In Pervasive 2012 Workshop on Frontiers in Accessibility for Pervasive Computing.
- Noah Hearle. 2011. Sentence and word length. Retrieved April 15, 2019 from http://ds.nahoo.net/Academic/Maths/Sentence.html.
- Jonggi Hong and Leah Findlater. 2018. Identifying speech input errors through audio-only interaction. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, 567--578.
- IntexVision. 2014. Intex-vision mobile phone for blind. Retrieved January 5, 2018 from http://intextechnologies.com/vani/vani-april-june 11.pdf.
- iPhone. 2014. Apple’s iPhone. Retrieved January 12, 2018 from http://www.apple.com/iphone/iphone-4s/specs.html.
- ITU. 2002. E.161: Arrangement of digits, letters and symbols on telephones and other devices that can be used for gaining access to a telephone network. Retrieved April 15, 2019 from http://www.itu.int/rec/T-REC-E.161-200102-I/en.
- Shaun K. Kane, Jacob O. Wobbrock, and Richard E. Ladner. 2011. Usable gestures for blind people: Understanding preference and performance. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 413--422.
- Akiyo Kano, Janet C. Read, and Alan Dix. 2006. Children’s phrase set for text input method evaluations. In Proceedings of the 4th Nordic Conference on Human-Computer Interaction: Changing Roles. ACM, 449--452.
- Gordon Kurtenbach and William Buxton. 1993. The limits of expert performance using hierarchic marking menus. In Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems. ACM, 482--487.
- Mingzhe Li, Mingming Fan, and Khai N. Truong. 2017. BrailleSketch: A gesture-based text input method for people with visual impairments. In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, 12--21.
- I. Scott MacKenzie and R. William Soukoreff. 2003. Phrase sets for evaluating text entry techniques. In CHI ’03 Extended Abstracts on Human Factors in Computing Systems. ACM, 754--755.
- I. Scott MacKenzie and Kumiko Tanaka-Ishii. 2010. Text Entry Systems: Mobility, Accessibility, Universality. Elsevier.
- I. Scott MacKenzie and Shawn X. Zhang. 1997. The immediate usability of Graffiti. In Graphics Interface ’97. Canadian Information Processing Society, 129--137.
- Elke Mattheiss, Georg Regal, Johann Schrammel, Markus Garschall, and Manfred Tscheligi. 2015. EdgeBraille: Braille-based text input for touch devices. Journal of Assistive Technologies 9, 3 (2015), 147--158.
- M. S. Mayzner and M. E. Tresselt. 1965. Table of Single-Letter and Digram Frequency Counts for Various Word-Length and Letter-Position Combinations. Psychonomic Monograph Supplements, 13--32.
- Ian McGraw, Rohit Prabhavalkar, Raziel Alvarez, et al. 2016. Personalized speech recognition on mobile devices. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP’16). IEEE.
- Jonas Moll and Eva-Lotta Sallnäs Pysander. 2013. A haptic tool for group work on geometrical concepts engaging blind and sighted pupils. ACM Transactions on Accessible Computing (TACCESS) 4, 4 (2013), 14.
- Nielsen. 2016. Smartphone users in India. Retrieved May 15, 2018 from http://www.nielsen.com/in/en/nielsen-pressroom/.
- Nuance. 2014. Nuance. Retrieved December 11, 2017 from http://www.nuance.com/.
- João Oliveira, Tiago Guerreiro, Hugo Nicolau, Joaquim Jorge, and Daniel Gonçalves. 2011a. Blind people and mobile touch-based text-entry: Acknowledging the need for different flavors. In Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, 179--186.
- João Oliveira, Tiago Guerreiro, Hugo Nicolau, Joaquim Jorge, and Daniel Gonçalves. 2011b. BrailleType: Unleashing Braille over touch screen mobile phones. In IFIP Conference on Human-Computer Interaction. Springer, 100--107.
- L. Pahus, P. R. Burgel, N. Roche, J. L. Paillasseur, and P. Chanez. 2019. Randomized controlled trials of pharmacological treatments to prevent COPD exacerbations: Applicability to real-life patients. BMC Pulmonary Medicine (2019).
- André Rodrigues, Hugo Nicolau, Kyle Montague, João Guerreiro, and Tiago Guerreiro. 2019. Open challenges of blind people with smartphones. Retrieved September 30, 2019 from https://arxiv.org/pdf/1909.09078.pdf.
- Mario Romero, Brian Frey, Caleb Southern, and Gregory D. Abowd. 2011. BrailleTouch: Designing a mobile eyes-free soft keyboard. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services. ACM, 707--709.
- Sherry Ruan, Jacob O. Wobbrock, Kenny Liou, Andrew Ng, and James A. Landay. 2017. Comparing speech and keyboard text entry for short messages in two languages on touchscreen phones. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1, 4, Article 159.
- Jaime Sánchez, Mauricio Sáenz, and Jose Miguel Garrido. 2010. Usability of a multimodal video game to improve navigation skills for blind children. ACM Transactions on Accessible Computing (TACCESS) 3, 2 (2010), 7.
- A. F. Sanders. 1980. Stage analysis of reaction processes. In Tutorials in Motor Behavior, G. E. Stelmach and J. Requin (Eds.). 331--354.
- Manoj Kumar Sharma and Debasis Samanta. 2014. Word prediction system for text entry in Hindi. ACM Transactions on Asian Language Information Processing 13, 2 (2014), 8.
- Siri. 2014. Apple’s SIRI. Retrieved April 19, 2018 from http://www.apple.com/ios/siri/.
- Caleb Southern, James Clawson, Brian Frey, Gregory Abowd, and Mario Romero. 2012. An evaluation of BrailleTouch: Mobile touchscreen text entry for the visually impaired. In Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services. ACM, 317--326.
- Statista. 2014. Smartphone usage forecast 2010--2018. Retrieved May 15, 2018 from http://www.statistia.com/statistics/269912/.
- Daniel Trindade, André Rodrigues, Tiago Guerreiro, and Hugo Nicolau. 2018. Hybrid-Brailler: Combining physical and gestural interaction for mobile Braille input and editing. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI’18). ACM.
- Radu-Daniel Vatavu. 2017. Visual impairments and mobile touchscreen interaction: State-of-the-art, causes of visual impairment, and design guidelines. International Journal of Human--Computer Interaction 33, 6 (2017), 486--509.
- VectorEntry. 2019. VectorEntry Android App. Retrieved April 30, 2019 from http://cse.iitkgp.ac.in/dsamanta/research_development/text_entry_systems.html.
- WHO. 2016. Blindness in developing countries. Retrieved April 10, 2018 from http://webaim.org/projects/screenreadersurvey5/.
- WHO. 2018. Blindness and visual impairment. Retrieved May 15, 2018 from http://www.who.int/mediacentre/factsheets/fs282/en/.
Index Terms
- VectorEntry: Text Entry Mechanism Using Handheld Touch-Enabled Mobile Devices for People with Visual Impairments