DOI: 10.1145/3597638.3614474

Starting well on design for accessibility: analysis of W3C's 167 accessibility evaluation tools for the design phase

Published: 22 October 2023

ABSTRACT

Accessibility can be overlooked in the Design-phase of creating digital products. This can lead to increased costs when problems are discovered later in product development or deployment. Our work aimed to discover how well the 167 W3C accessibility evaluation tools support the Design-phase. Using Grounded Theory, we identified key characteristics of the tools and their support. We found that just 30 (18%) of the tools support the Design-phase; by contrast, 128 (76.5%) support the Later-phases. Of the 30 tools supporting the Design-phase, 25 (83%) support color checks, but few support the other W3C basic design considerations. Our key contributions are: (1) our comprehensive study of the 167 W3C accessibility evaluation tools; (2) our insights about their support for the Design-phase; (3) our recommendations for improved support for accessibility in the Design-phase.
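To illustrate the kind of check that dominates Design-phase support in our sample, the sketch below computes a WCAG 2.x contrast ratio. It is a minimal illustrative example in Python, not code taken from any tool on the W3C list; the function names are our own, while the luminance formula and the 4.5:1 AA threshold for normal text follow the published WCAG 2.x definitions.

def relative_luminance(hex_color):
    """Relative luminance of an sRGB color written as '#RRGGBB' (WCAG 2.x definition)."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def linearize(c):
        # Undo sRGB gamma per the WCAG 2.x formula.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(foreground, background):
    """Contrast ratio between two colors, from 1:1 (identical) to 21:1 (black on white)."""
    lighter, darker = sorted((relative_luminance(foreground),
                              relative_luminance(background)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Example: mid-grey text on a white background just clears the 4.5:1 AA threshold.
ratio = contrast_ratio("#767676", "#FFFFFF")
print(f"{ratio:.2f}:1 ->", "passes" if ratio >= 4.5 else "fails", "WCAG AA for normal text")

A check of this shape is what most of the 25 color-capable Design-phase tools provide; the other W3C basic design considerations (for example headings and form structure) rarely receive comparable support.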


Published in

              ASSETS '23: Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility
              October 2023
              1163 pages
              ISBN:9798400702204
              DOI:10.1145/3597638

              Copyright © 2023 Owner/Author

              Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

              Publisher

              Association for Computing Machinery

              New York, NY, United States

              Publication History

              • Published: 22 October 2023


              Qualifiers

              • poster
              • Research
              • Refereed limited

              Acceptance Rates

ASSETS '23 Paper Acceptance Rate: 55 of 182 submissions, 30%
Overall Acceptance Rate: 436 of 1,556 submissions, 28%
