Abstract
Although a large amount of research has been conducted on building interfaces that allow visually impaired users to read web pages and to generate and access information on computers, little development addresses two problems faced by blind users. First, sighted users can rapidly browse and select the information they find useful; second, sighted users can make much of this information portable through the recent proliferation of personal digital assistants (PDAs). Neither capability is currently available to blind users. This paper describes an interface, built on a standard PDA, that allows its user to browse the information stored on it through screen touches coupled with auditory feedback. The system also supports the storage and management of personal information, so that addresses, music, directions, and other supportive information can be readily created and then accessed anytime and anywhere by the PDA user. The paper describes the system along with the related design choices and design rationale. A user study is also reported.
Chen, X., Tremaine, M., Lutz, R. et al. AudioBrowser: a mobile browsable information access for the visually impaired. Univ Access Inf Soc 5, 4–22 (2006). https://doi.org/10.1007/s10209-006-0019-y