Abstract
Many documents are highly structured, for example newspaper articles and scientific, mathematical or technical literature. Based on inductive research with 200 blind and visually impaired participants, a multi-modal user interface for the non-visual presentation, navigation and manipulation of structured documents on mobile and wearable devices such as smartphones, smartwatches and tablets has been developed. It enables the user to gain a quick overview of the document structure and to skim and scan the document content efficiently by identifying the type, level, position, length, relationship and content text of each element, as well as to focus, select, activate, move, remove and insert structure elements or text. These interactions are presented non-visually using earcons, tactons and speech synthesis, addressing the aural and tactile senses. Navigation and manipulation are provided through the multitouch, motion (linear acceleration and rotation) or speech-recognition input modality. The result is a complete solution for reading, creating and editing structured documents in a non-visual way, with no special hardware required. For the development, testing and evaluation of the user interface, a flexible, platform-independent software architecture has been developed and implemented for iOS and Android. The user interface was evaluated through structured observation of 160 blind and visually impaired participants using the implemented software (app) over the Internet.
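The interaction model summarised above rests on a tree-structured document representation (in the spirit of the W3C DOM), in which each element can report the properties the interface announces non-visually: type, level, position among its siblings, length, and content text. As a minimal sketch — the class and method names below are illustrative assumptions, not the paper's actual architecture — such a model might look like this:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Element:
    """One node of a structured document (hypothetical model)."""
    kind: str                              # e.g. "heading", "paragraph", "list-item"
    text: str = ""
    children: list = field(default_factory=list)
    parent: Optional["Element"] = None

    def add(self, child: "Element") -> "Element":
        """Attach a child element and return it for chaining."""
        child.parent = self
        self.children.append(child)
        return child

    @property
    def level(self) -> int:
        """Depth in the tree; the document root is level 0."""
        return 0 if self.parent is None else self.parent.level + 1

    @property
    def position(self) -> int:
        """1-based index among siblings."""
        return 1 if self.parent is None else self.parent.children.index(self) + 1

    @property
    def length(self) -> int:
        """Length of the element's own content text."""
        return len(self.text)

    def describe(self) -> str:
        """The kind of summary a speech synthesiser might announce."""
        total = 1 if self.parent is None else len(self.parent.children)
        return (f"{self.kind}, level {self.level}, "
                f"item {self.position} of {total}, {self.length} characters")

def next_sibling(el: Element) -> Optional[Element]:
    """Skimming primitive: move to the following element on the same level."""
    if el.parent is None:
        return None
    sibs = el.parent.children
    i = sibs.index(el)
    return sibs[i + 1] if i + 1 < len(sibs) else None
```

For example, building a two-element document and announcing its first element:

```python
doc = Element("document")
h = doc.add(Element("heading", "Introduction"))
p = doc.add(Element("paragraph", "Hello world."))
h.describe()        # → "heading, level 1, item 1 of 2, 12 characters"
next_sibling(h)     # → the paragraph element
```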
© 2014 Springer International Publishing Switzerland
Cite this paper
Dorigo, M.L., Harriehausen-Mühlbauer, B., Stengel, I., Dowland, P. (2014). Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices. In: Miesenberger, K., Fels, D., Archambault, D., Peňáz, P., Zagler, W. (eds) Computers Helping People with Special Needs. ICCHP 2014. Lecture Notes in Computer Science, vol 8547. Springer, Cham. https://doi.org/10.1007/978-3-319-08596-8_59
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-08595-1
Online ISBN: 978-3-319-08596-8