Abstract
Crowdsourcing technology can support manual accessibility testing by soliciting contributions from volunteer evaluators, but crowd evaluators may return inaccurate or invalid evaluation results. This paper proposes CrowdAE, an advanced crowdsourcing-based web accessibility evaluation system that enhances the crowdsourcing-based manual testing module of the previous version of our system. Through three main processes, namely a learning system, task assignment, and task review, CrowdAE improves the quality of the evaluation results obtained from the crowd. A comparison over two years of evaluations of Chinese government websites shows that CrowdAE outperforms the previous version and improves the accuracy of the evaluation results.
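The abstract names the three quality-control stages but does not detail them here. The following is a minimal sketch, assuming the stages amount to a gold-standard qualification test, redundant task assignment, and majority-vote review; all class names, function names, and thresholds below are hypothetical illustrations, not taken from the paper.

```python
# Illustrative sketch only: the three stages named in the abstract
# (learning system, task assignment, task review) modeled here as a
# qualification test, redundant assignment, and majority-vote review.
# All names, signatures, and thresholds are assumptions, not from the paper.
from dataclasses import dataclass
import random


@dataclass
class Evaluator:
    name: str
    accuracy: float = 0.0   # score on training tasks with known answers
    qualified: bool = False


def learning_system(evaluators, pass_threshold=0.8):
    """Stage 1: evaluators work through tasks with known ('gold') answers;
    only those whose accuracy meets the threshold are qualified."""
    for ev in evaluators:
        ev.accuracy = random.uniform(0.5, 1.0)  # stand-in for a real quiz score
        ev.qualified = ev.accuracy >= pass_threshold
    return [ev for ev in evaluators if ev.qualified]


def assign_tasks(pages, qualified, redundancy=3):
    """Stage 2: assign each page to several qualified evaluators so their
    answers can be cross-checked against each other."""
    return {page: random.sample(qualified, min(redundancy, len(qualified)))
            for page in pages}


def review_results(assignments, verdicts):
    """Stage 3: accept a reported accessibility barrier on a page only
    when a majority of its assigned evaluators report it."""
    accepted = {}
    for page, evaluators in assignments.items():
        votes = [verdicts[(page, ev.name)] for ev in evaluators]
        accepted[page] = sum(votes) > len(votes) / 2
    return accepted


if __name__ == "__main__":
    random.seed(0)
    crowd = [Evaluator(f"evaluator-{i}") for i in range(10)]
    qualified = learning_system(crowd)
    pages = ["gov.example/home", "gov.example/services"]
    assignments = assign_tasks(pages, qualified)
    # Stand-in verdicts: True means the evaluator reported a barrier.
    verdicts = {(page, ev.name): random.random() < ev.accuracy
                for page, evs in assignments.items() for ev in evs}
    print(review_results(assignments, verdicts))
```

Majority voting is only one plausible review rule; the paper's actual review mechanism may differ.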
Acknowledgement
This work is supported by the National Natural Science Foundation of China (Nos. 61173185 and 61173186), the National Key Technology R&D Program of China (Nos. 2012BAI34B01 and 2014BAK15B02), and the Hangzhou S&T Development Plan (No. 20150834M22).