
Accessibility of audio and tactile interfaces for young blind people performing everyday tasks

  • Long Paper
  • Published in: Universal Access in the Information Society

Abstract

Increasingly, computers are becoming tools for communication, information seeking and study for young people, regardless of their abilities. Scientists have been building knowledge on how blind people can substitute hearing or touch for sight, and on how combining senses, i.e., multimodality, can give users an effective way of exploiting the power of computers. Evaluating such multimodal user interfaces in the right context, i.e., with appropriate users, tasks, tools and environment, is essential to give designers accurate feedback on blind users’ needs. This paper presents a study of how young blind people use computers for everyday tasks with the aid of assistive technologies, aiming to understand what hinders and what supports them when they interact with a computer through individual senses. A common assistive technology is a screen reader, which produces output on a speech synthesizer or a Braille display. These two modes are often used together, but this research studied how visually impaired students interact with computers using each mode on its own, i.e., either a speech synthesizer or a Braille display. A usability test was performed to assess blind grade-school students’ ability to carry out common tasks with the help of a computer, including solving mathematical problems, navigating the web, communicating by e-mail and word processing. During the usability tests, students were allowed to use either the auditory mode or the tactile mode. Although blind users most commonly use a speech synthesizer (audio), the results indicate that this was not always the most suitable modality. While the effectiveness of the Braille display (tactile user interface) for accomplishing certain tasks was similar to that of the audio user interface, users’ satisfaction with it was higher. The first contribution of this work lies in answering two research questions by analysing two modes of interaction (tactile and speech) while participants carried out tasks of varying genres, i.e., web searching, collaboration through e-mail, word processing and mathematics. A second contribution is the classification of observations into four categories: usability and accessibility, software fault, cognitive mechanism and learning method; the observations, practical recommendations and open research problems are then presented and discussed, providing a framework for similar studies in the future. A third contribution is the elaboration of practical recommendations for user interface designers and a research agenda for scientists.
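The comparison of modalities above rests on standard usability measures, effectiveness and satisfaction in the sense of ISO 9241-11. The following minimal sketch shows one way such per-modality measures could be tabulated from test sessions; the Session fields, the rating scale and the sample values are illustrative assumptions, not data or code from the study.

```python
# Hypothetical sketch: tabulating per-modality usability measures
# (effectiveness as completion rate, satisfaction as a mean rating).
# All values below are illustrative placeholders, not the study's results.
from dataclasses import dataclass
from statistics import mean


@dataclass
class Session:
    modality: str       # "speech" (audio) or "braille" (tactile)
    task: str           # e.g. "web search", "e-mail", "word processing", "maths"
    completed: bool     # did the participant finish the task?
    satisfaction: int   # post-task rating, e.g. on a 1-5 scale


def summarise(sessions: list[Session]) -> dict[str, dict[str, float]]:
    """Group sessions by modality; report completion rate and mean satisfaction."""
    summary: dict[str, dict[str, float]] = {}
    for modality in {s.modality for s in sessions}:
        group = [s for s in sessions if s.modality == modality]
        summary[modality] = {
            "effectiveness": sum(s.completed for s in group) / len(group),
            "mean_satisfaction": mean(s.satisfaction for s in group),
        }
    return summary


if __name__ == "__main__":
    sessions = [  # placeholder observations, for illustration only
        Session("speech", "web search", True, 3),
        Session("speech", "e-mail", False, 2),
        Session("braille", "web search", True, 4),
        Session("braille", "maths", True, 5),
    ]
    for modality, measures in summarise(sessions).items():
        print(modality, measures)
```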





Acknowledgements

The authors thank the participants in the research study for their valuable time, and the reviewers for their constructive criticism. The work was funded by the Icelandic Research Fund, RANNIS.

Author information

Corresponding author: Ebba Thora Hvannberg.



Cite this article

Shimomura, Y., Hvannberg, E.T. & Hafsteinsson, H. Accessibility of audio and tactile interfaces for young blind people performing everyday tasks. Univ Access Inf Soc 9, 297–310 (2010). https://doi.org/10.1007/s10209-009-0183-y

