DOI: 10.1145/2513383.2513440
Research Article

Exploring the use of speech input by blind people on mobile devices

Published: 21 October 2013

Abstract

Much recent work has explored the challenge of nonvisual text entry on mobile devices. While researchers have attempted to solve this problem with gestures, we explore a different modality: speech. We conducted a survey with 169 blind and sighted participants to investigate how often, what for, and why blind people used speech for input on their mobile devices. We found that blind people used speech more often and input longer messages than sighted people. We then conducted a study with 8 blind people to observe how they used speech input on an iPod compared with the on-screen keyboard with VoiceOver. We found that speech was nearly 5 times as fast as the keyboard. While participants were mostly satisfied with speech input, editing recognition errors was frustrating. Participants spent an average of 80.3% of their time editing. Finally, we propose challenges for future work, including more efficient eyes-free editing and better error detection methods for reviewing text.
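
The speed and editing-time figures above follow standard text-entry metrics: entry rate in words per minute (using the usual five-characters-per-word convention) and the share of task time spent in an editing phase. As a rough illustration only, the sketch below shows one way such metrics could be computed from logged trial data; the function names, log format, and example numbers are hypothetical and are not taken from the paper.

# Minimal sketch (assumptions: hypothetical function names and log format; not the authors' code).
# Illustrates the usual text-entry metrics behind claims such as "nearly 5x as fast"
# and "80.3% of time spent editing".

def words_per_minute(transcribed_text: str, seconds: float) -> float:
    """Entry rate in WPM, using the convention that one word = 5 characters."""
    if seconds <= 0:
        raise ValueError("duration must be positive")
    return (len(transcribed_text) / 5.0) * (60.0 / seconds)

def percent_time_editing(phase_seconds: dict) -> float:
    """Share of total trial time spent in the 'editing' phase, as a percentage."""
    total = sum(phase_seconds.values())
    return 100.0 * phase_seconds.get("editing", 0.0) / total if total else 0.0

# Hypothetical trial: a 120-character message dictated in 25 s, then edited for 100 s.
print(round(words_per_minute("x" * 120, 25.0), 1))                            # 57.6 WPM (dictation phase only)
print(round(percent_time_editing({"dictation": 25.0, "editing": 100.0}), 1))  # 80.0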


Information

Published In

ASSETS '13: Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility
October 2013
343 pages
ISBN:9781450324052
DOI:10.1145/2513383
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 October 2013

Author Tags

  1. dictation
  2. eyes-free
  3. mobile devices
  4. text entry

Qualifiers

  • Research-article

Conference

ASSETS '13

Acceptance Rates

ASSETS '13 Paper Acceptance Rate 28 of 98 submissions, 29%;
Overall Acceptance Rate 436 of 1,556 submissions, 28%

Article Metrics

  • Downloads (last 12 months): 127
  • Downloads (last 6 weeks): 8
Reflects downloads up to 14 Feb 2025

Cited By

  • (2025) From student feedback to design: Revisiting mobile courseware interfaces. Advances in Mobile Learning Educational Research 5(1), 1284-1300. DOI: 10.25082/AMLER.2025.01.005. Online publication date: 14-Jan-2025
  • (2025) Breaking down barriers. International Journal of Human-Computer Studies 194(C). DOI: 10.1016/j.ijhcs.2024.103403. Online publication date: 1-Feb-2025
  • (2025) Limitations in speech recognition for young adults with down syndrome. Universal Access in the Information Society. DOI: 10.1007/s10209-025-01197-4. Online publication date: 15-Feb-2025
  • (2024) Possibility of Accessing Information Via Smartphone Applications for People with Visual Impairment: A Field Study. Academic International Journal of Social Sciences and Humanities 2(1), 37-72. DOI: 10.59675/S214U. Online publication date: 22-Jun-2024
  • (2024) Toward Effective Communication of AI-Based Decisions in Assistive Tools: Conveying Confidence and Doubt to People with Visual Impairments at Accelerated Speech. Proceedings of the 21st International Web for All Conference, 177-189. DOI: 10.1145/3677846.3677862. Online publication date: 13-May-2024
  • (2024) A Recipe for Success? Exploring Strategies for Improving Non-Visual Access to Cooking Instructions. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-15. DOI: 10.1145/3663548.3675662. Online publication date: 27-Oct-2024
  • (2024) Intersecting Liminality: Acquiring a Smartphone as a Blind or Low Vision Older Adult. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-14. DOI: 10.1145/3663548.3675622. Online publication date: 27-Oct-2024
  • (2024) Breaking the News Barrier: Towards Understanding News Consumption Practices among BVI Individuals in India. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-11. DOI: 10.1145/3663548.3675608. Online publication date: 27-Oct-2024
  • (2024) Accessible Gesture Typing on Smartphones for People with Low Vision. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3654777.3676447. Online publication date: 13-Oct-2024
  • (2024) Improving FlexType: Ambiguous Text Input for Users with Visual Impairments. Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments, 130-139. DOI: 10.1145/3652037.3652059. Online publication date: 26-Jun-2024
