DigiTaps: eyes-free number entry on touchscreens with minimal audio feedback

Published: 08 October 2013

Abstract

Eyes-free input usually relies on audio feedback that can be difficult to hear in noisy environments. We present DigiTaps, an eyes-free number entry method for touchscreen devices that requires little auditory attention. To enter a digit, users tap or swipe anywhere on the screen with one, two, or three fingers. The 10 digits are encoded by combinations of these gestures that relate to the digits' semantics. For example, the digit 2 is input with a 2-finger tap. We conducted a longitudinal evaluation with 16 people and found that DigiTaps with no audio feedback was faster but less accurate than with audio feedback after every input. Throughout the study, participants entered numbers with no audio feedback at an average rate of 0.87 characters per second, with an uncorrected error rate of 5.63%.
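
As a rough illustration of the kind of gesture-to-digit decoding the abstract describes, the sketch below (Python) maps short tap/swipe gesture sequences to digits. This is a hypothetical sketch only: the sole mapping taken from the abstract is the 2-finger tap for the digit 2; every other entry in the table, and the specific sequence structure, are invented placeholders rather than the actual DigiTaps encoding.

# Illustrative sketch (not the actual DigiTaps encoding): decoding short
# sequences of tap/swipe gestures into digits. Only ("tap", 2) -> 2 is
# confirmed by the abstract; the other entries are hypothetical placeholders.

from typing import Optional, Sequence, Tuple

Gesture = Tuple[str, int]  # (gesture type, finger count), e.g. ("tap", 2)

ASSUMED_CODE = {
    (("tap", 1),): 1,
    (("tap", 2),): 2,              # from the abstract: 2-finger tap -> digit 2
    (("tap", 3),): 3,
    (("swipe", 1),): 4,            # hypothetical
    (("swipe", 2),): 5,            # hypothetical
    (("tap", 1), ("tap", 3)): 7,   # hypothetical combination of gestures
}

def decode(sequence: Sequence[Gesture]) -> Optional[int]:
    """Return the digit for a completed gesture sequence, or None if unmapped."""
    return ASSUMED_CODE.get(tuple(sequence))

if __name__ == "__main__":
    print(decode([("tap", 2)]))               # -> 2
    print(decode([("tap", 1), ("tap", 3)]))   # -> 7 under the assumed code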

Published In

UIST '13: Proceedings of the 26th annual ACM symposium on User interface software and technology
October 2013
558 pages
ISBN:9781450322683
DOI:10.1145/2501988

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. blind
  2. eyes-free text entry
  3. mobile devices
  4. touchscreen

Qualifiers

  • Research-article

Conference

UIST '13: The 26th Annual ACM Symposium on User Interface Software and Technology
October 8 - 11, 2013
St. Andrews, Scotland, United Kingdom

Acceptance Rates

UIST '13 Paper Acceptance Rate: 62 of 317 submissions, 20%
Overall Acceptance Rate: 561 of 2,567 submissions, 22%

Cited By

  • (2024) Integrated Calculators: Moving Calculation into the World. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 343-355. DOI: 10.1145/3643834.3661523. Online publication date: 1-Jul-2024.
  • (2023) SwingBoard: introducing swipe based virtual keyboard for visually impaired and blind users. Disability and Rehabilitation: Assistive Technology 19(4), 1482-1493. DOI: 10.1080/17483107.2023.2199793. Online publication date: 25-Apr-2023.
  • (2022) VoLearn. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), 1-26. DOI: 10.1145/3534576. Online publication date: 7-Jul-2022.
  • (2022) TapCAPTCHA: non-visual CAPTCHA on touchscreens for visually impaired people. Journal on Multimodal User Interfaces 16(4), 385-398. DOI: 10.1007/s12193-022-00394-2. Online publication date: 31-Oct-2022.
  • (2022) Digital Authentication for Visually Disabled People: Initial Results of an Online Survey. Computers Helping People with Special Needs, 41-50. DOI: 10.1007/978-3-031-08645-8_6. Online publication date: 11-Jul-2022.
  • (2021) Automated repair of size-based inaccessibility issues in mobile applications. Proceedings of the 36th IEEE/ACM International Conference on Automated Software Engineering, 730-742. DOI: 10.1109/ASE51524.2021.9678625. Online publication date: 15-Nov-2021.
  • (2020) Emoji Accessibility for Visually Impaired People. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3313831.3376267. Online publication date: 21-Apr-2020.
  • (2020) QB-Gest: Qwerty Bimanual Gestural Input for Eyes-Free Smartphone Text Input. Universal Access in Human-Computer Interaction. Design Approaches and Supporting Technologies, 223-242. DOI: 10.1007/978-3-030-49282-3_16. Online publication date: 10-Jul-2020.
  • (2019) Demonstration of GestureCalc. Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, 667-669. DOI: 10.1145/3308561.3354595. Online publication date: 24-Oct-2019.
  • (2019) GestureCalc. Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, 112-123. DOI: 10.1145/3308561.3353783. Online publication date: 24-Oct-2019.
