
CrowdAE: A Crowdsourcing System with Human Inspection Quality Enhancement for Web Accessibility Evaluation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10896)

Abstract

Crowdsourcing can support manual accessibility testing by soliciting contributions from volunteer evaluators, but crowd evaluators may return inaccurate or invalid results. This paper proposes CrowdAE, an advanced crowdsourcing-based web accessibility evaluation system that enhances the crowdsourced manual-testing module of our previous system. Through three main processes, namely a learning system, task assignment, and task review, CrowdAE improves the quality of the evaluation results obtained from the crowd. A comparison over two years of evaluations of Chinese government websites shows that CrowdAE outperforms the previous version and improves the accuracy of the evaluation results.
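The abstract does not detail how the review stage aggregates crowd judgments. As a minimal sketch, assuming a weighted-majority review step, the Python fragment below illustrates one plausible mechanism: per-evaluator reliability weights, such as might be estimated during the learning-system stage, are used at review time to aggregate judgments and to flag low-consensus tasks for re-assignment. The names review_task and reliability and the consensus threshold are illustrative assumptions, not the paper's published implementation.

    from collections import defaultdict

    # Hypothetical sketch only: the paper's actual review algorithm is
    # not described in this abstract. Reliability weights are assumed
    # to come from a learning stage (e.g. accuracy on known tasks).
    def review_task(judgments, reliability, consensus=0.6):
        """Aggregate crowd judgments for one accessibility check.

        judgments:   {evaluator_id: "pass" | "fail"}
        reliability: {evaluator_id: weight in (0, 1]}
        Returns the weighted-majority label, or None when consensus
        is too weak and the task should be re-assigned or escalated.
        """
        totals = defaultdict(float)
        for evaluator, label in judgments.items():
            # Unknown evaluators get a neutral default weight.
            totals[label] += reliability.get(evaluator, 0.5)
        total = sum(totals.values())
        if total == 0:
            return None
        label, score = max(totals.items(), key=lambda kv: kv[1])
        return label if score / total >= consensus else None

    # Example: two reliable evaluators agree on "fail".
    judgments = {"e1": "fail", "e2": "fail", "e3": "pass"}
    reliability = {"e1": 0.9, "e2": 0.8, "e3": 0.4}
    print(review_task(judgments, reliability))  # -> fail

Requiring a consensus threshold rather than a bare majority gives the review stage a natural escalation path: tasks whose weighted votes are close can be routed to additional evaluators or to an expert, which is one way a system like CrowdAE could trade throughput for accuracy.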



Acknowledgement

This work is supported by the National Natural Science Foundation of China (Nos. 61173185 and 61173186), the National Key Technology R&D Program of China (Nos. 2012BAI34B01 and 2014BAK15B02), and the Hangzhou S&T Development Plan (No. 20150834M22).

Author information


Corresponding author

Correspondence to Jiajun Bu.



Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Li, L. et al. (2018). CrowdAE: A Crowdsourcing System with Human Inspection Quality Enhancement for Web Accessibility Evaluation. In: Miesenberger, K., Kouroupetroglou, G. (eds.) Computers Helping People with Special Needs. ICCHP 2018. Lecture Notes in Computer Science, vol. 10896. Springer, Cham. https://doi.org/10.1007/978-3-319-94277-3_5


  • DOI: https://doi.org/10.1007/978-3-319-94277-3_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-94276-6

  • Online ISBN: 978-3-319-94277-3

  • eBook Packages: Computer Science, Computer Science (R0)
