DOI: 10.1145/1054972.1054979
Article

Is your web page accessible?: a comparative study of methods for assessing web page accessibility for the blind

Published: 02 April 2005

ABSTRACT

Web access for users with disabilities is an important goal and challenging problem for web content developers and designers. This paper presents a comparison of different methods for finding accessibility problems affecting users who are blind. Our comparison focuses on techniques that might be of use to Web developers without accessibility experience, a large and important group that represents a major source of inaccessible pages. We compare a laboratory study with blind users to an automated tool, expert review by web designers with and without a screen reader, and remote testing by blind users. Multiple developers, using a screen reader, were most consistently successful at finding most classes of problems, and tended to find about 50% of known problems. Surprisingly, a remote study with blind users was one of the least effective methods. All of the techniques, however, had different, complementary strengths and weaknesses.
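
As a concrete illustration of the kind of markup-level check an automated tool performs (for example, flagging images that lack the text equivalents required by WCAG 1.0), the sketch below scans an HTML string for img elements with a missing or empty alt attribute. This example is not taken from the paper; the use of Python with BeautifulSoup and all function and variable names are illustrative assumptions.

    # Minimal sketch (illustrative, not from the paper): flag <img> elements that
    # lack a non-empty alt attribute, the kind of problem automated accessibility
    # checkers report. Assumes the third-party beautifulsoup4 package is installed.
    from bs4 import BeautifulSoup

    def images_missing_alt(html):
        """Return the src of every <img> without a non-empty alt attribute."""
        soup = BeautifulSoup(html, "html.parser")
        return [
            img.get("src", "(no src)")
            for img in soup.find_all("img")
            if not (img.get("alt") or "").strip()
        ]

    # Example: the first image is flagged, the second is not.
    page = '<img src="logo.gif"><img src="chart.png" alt="Sales by quarter">'
    print(images_missing_alt(page))  # ['logo.gif']

Checks like this catch only mechanically detectable problems (missing markup), which is one reason the paper compares automated tools against evaluation by human experts and blind users.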

References

  1. Bobby Worldwide, http://bobby.watchfire.com/bobby/html/en/index.jsp]]Google ScholarGoogle Scholar
  2. Bureau of the Census, "Survey of Income and Program Participation," 1994--95, http://www.census.go v/hhes/www/disable/dissipp.html]]Google ScholarGoogle Scholar
  3. Clark, J., Building accessible websites, New Riders, IN, 2002.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  4. Cockton, G., Lavery, D. and Woolrych, A., "Inspection-based evaluations," The Human Computer-Interaction Handbook, Jacko, J. A. and Sears, A. (editors), pp. 1118--1138.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  5. Cockton, G. and Woolrych, A., "Understanding inspection methods: Lessons from an assessment of heuristic evaluation," In A. Blandford and J. Vanderdonckt, (Eds.), People & Computers XV, Springer-Verlag, pp. 171--192, 2001.]]Google ScholarGoogle Scholar
  6. Colwell, C. and Petrie, H., "Evaluation of guidelines for designing accessible web content," In C. Bühler & H. Knops (Eds), Assistive technology on the threshold of the new millennium, IOS press, 1999.]]Google ScholarGoogle Scholar
  7. Coyne, K. P. and Nielsen, J., "Beyond ALT Text: Making the web easy to use for users with disabilities," Nielsen, Norman Group, October, 2001, Available at: http://www.nngroup.com/reports/accessibility/]]Google ScholarGoogle Scholar
  8. Coyne, K. P. and Nielsen, J., "How to conduct usability evaluations for accessibility: Methodology guidelines for testing websites and intranets with users who use assistive technology," Nielsen Norman Group, October, 2001, Available at: http://www.nngroup.com/reports/accessibility/testing/]]Google ScholarGoogle Scholar
  9. Diaper, D. and Worman, L., "Two falls out of three in automated accessibility assessment of world wide web sites: A-prompt vs. Bobby," In Johnson, P. and Palanque, P. (Eds.), People and Computers XVII, Springer-Verlag.]]Google ScholarGoogle Scholar
  10. Gray, W.D, and Salzman, M.C., "Damaged merchandise? A review of experiments that compare usability evaluation methods," Human-Computer Interaction, 13(3):203--261, 1998.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  11. Hartson, H. R., Andre, T. S. and Williges, R. C., "Criteria for evaluating usability evaluation methods," International Journal of Human Computer Interaction, 13(4):373--410, 2001.]]Google ScholarGoogle ScholarCross RefCross Ref
  12. Hertzum, M. and Jacobsen, N. E., "The Evaluator Effect: A Chilling Fact about Usability Evaluation Methods," International Journal of Human Computer Interaction, 13(4):412--443, 2001.]]Google ScholarGoogle ScholarCross RefCross Ref
  13. Ivory, M. and Chevalier, A., "A Study of Automated Web Site Evaluation Tools," Technical Report UW-CSE-02-10-01, University of Washington, Department of Computer Science and Engineering, 2002.]]Google ScholarGoogle Scholar
  14. LIFT, http://www.usablenet.com/]]Google ScholarGoogle Scholar
  15. National Center for Health Statistics, "National health interview survey - disability supplement," 1994 and 1995.]]Google ScholarGoogle Scholar
  16. Nielsen, J. and Molich, R., "Heuristic Evaluation of User Interfaces," In Proc. of CHI'90, pp. 249--256, ACM Press, 1990.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  17. Paciello, M. G., Web accessibility for people with disabilities, CMP Books, KA, 2000.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  18. Petrie, H. and Colwell, C., "Tool to assist authors in creating accessible web pages," In Proc. of NTEVH 98: Telematics in the education of the visually handicapped, 1998. http://www.snv.jussieu.fr/inova/publi/ntevh/tools.htm]]Google ScholarGoogle Scholar
  19. Rowan, M., et al., "Evaluating web resources for disability access," In Proc. of ASSETS '00, pp. 80--84, ACM Press, 2000.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  20. Sears, A., "Heuristic Walkthroughs: Finding the problems without the noise," International Journal of Human-Computer Interaction, 9:213--234.]]Google ScholarGoogle ScholarCross RefCross Ref
  21. Shneiderman, B. and Plaisant, C., Designing the user interface, 4th Edition, Pearson Education.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  22. Sloan, D. et al., "Accessible accessibility," In Proc. of CUU'00, pp.96--101, ACM Press, 2000.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  23. Sullivan, T. and Matson, R., "Barriers to use: Usability and content accessibility on the web's most popular sites," In Proc. of CUU'00, pp. 139--144, ACM Press, 2000.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  24. Thatcher, J. et al., Accessible web sites, Springer-Verlag, NY, 2002.]] Google ScholarGoogle ScholarDigital LibraryDigital Library
  25. Woolrych, A. and Cockton, G., "Assessing Heuristic Evaluation: Mind the quality, not just the percentages," in Proc. of HCI 2000, pp. 35--36, 2000.]]Google ScholarGoogle Scholar
  26. W3C Markup Validation Service, http://validator.w3.org/]]Google ScholarGoogle Scholar
  27. W3C Web Content Accessibility Guidelines 1.0, http://www.w3.org/TR/WAI-WEBCONTENT]]Google ScholarGoogle Scholar

Published in

CHI '05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2005, 928 pages
ISBN: 1581139985
DOI: 10.1145/1054972

              Copyright © 2005 ACM

              Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

              Publisher

              Association for Computing Machinery

              New York, NY, United States


              Acceptance Rates

CHI '05 Paper Acceptance Rate: 93 of 372 submissions, 25%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
