Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8547)

Abstract

Many documents are highly structured, for example newspaper articles and scientific, mathematical, or technical literature. Based on inductive research with 200 blind and visually impaired participants, a multimodal user interface has been developed for the non-visual presentation, navigation, and manipulation of structured documents on mobile and wearable devices such as smartphones, smartwatches, and tablets. It enables users to gain a quick overview of the document structure and to efficiently skim and scan the document content by identifying the type, level, position, length, relationship, and content text of each element, as well as to focus, select, activate, move, remove, and insert structure elements or text. These interactions are presented non-visually using earcons, tactons, and speech synthesis, serving the aural and tactile human senses. Navigation and manipulation are provided through the multitouch, motion (linear acceleration and rotation), or speech-recognition input modalities. The result is a complete solution for reading, creating, and editing structured documents in a non-visual way, with no special hardware required. For the development, testing, and evaluation of the user interface, a flexible, platform-independent software architecture has been designed and implemented for iOS and Android. The user interface was evaluated through a structured observation of 160 blind and visually impaired participants using the implemented app over the Internet.
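The abstract describes elements that carry structural metadata (type, level, position, length) which the interface announces non-visually. As a minimal sketch of that idea, the hypothetical model below (class and field names are illustrative, not from the paper) represents a document as a tree of typed elements and composes a screen-reader-style skim announcement from the metadata:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: a structured-document element exposing the
// properties the interface announces (type, level, position, content text).
class StructureElement {
    final String type;   // e.g. "heading", "paragraph", "list-item"
    final int level;     // nesting depth within the document tree
    final String text;   // content text rendered by speech synthesis
    final List<StructureElement> children = new ArrayList<>();

    StructureElement(String type, int level, String text) {
        this.type = type;
        this.level = level;
        this.text = text;
    }

    StructureElement add(StructureElement child) {
        children.add(child);
        return this;
    }

    // Combine structural metadata with content text into one skim
    // announcement, as a non-visual overview might read it out.
    String announce(int position, int total) {
        return String.format("%s, level %d, %d of %d: %s",
                type, level, position, total, text);
    }
}

public class SkimDemo {
    public static void main(String[] args) {
        StructureElement doc = new StructureElement("article", 0, "Sample");
        doc.add(new StructureElement("heading", 1, "Introduction"))
           .add(new StructureElement("paragraph", 1, "First paragraph."));
        for (int i = 0; i < doc.children.size(); i++) {
            System.out.println(doc.children.get(i)
                    .announce(i + 1, doc.children.size()));
        }
    }
}
```

In the actual system such an announcement would be mapped onto earcons, tactons, and synthesized speech rather than printed text; the sketch only illustrates the element metadata the paper enumerates.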





Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Dorigo, M.L., Harriehausen-Mühlbauer, B., Stengel, I., Dowland, P. (2014). Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices. In: Miesenberger, K., Fels, D., Archambault, D., Peňáz, P., Zagler, W. (eds) Computers Helping People with Special Needs. ICCHP 2014. Lecture Notes in Computer Science, vol 8547. Springer, Cham. https://doi.org/10.1007/978-3-319-08596-8_59

  • DOI: https://doi.org/10.1007/978-3-319-08596-8_59

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-08595-1

  • Online ISBN: 978-3-319-08596-8

  • eBook Packages: Computer Science (R0)
