End-User Evaluations

Part of the book series: Human–Computer Interaction Series (HCIS)

Abstract

The past few years have seen tremendous development in web technologies, with a wide range of websites and mobile applications built to support a variety of online activities. The ubiquitous nature and increasing complexity of these technologies mean that ensuring accessibility remains challenging. Accessibility evaluation refers to the process of examining a product and establishing the extent to which it supports accessibility by identifying potential barriers. While accessibility guidelines can guide the development process and automated evaluation tools can assist in measuring conformance, neither guarantees that a product will be accessible in a live context. The most reliable way to evaluate the accessibility of a product is to conduct a study in which representative users interact with it. This chapter outlines a range of methods that can be used to ensure that a product is designed to meet the requirements and specific needs of its users, from ideation through design and iterative development. The strengths and weaknesses of each method are described, along with the primary considerations for ensuring that the results of a study are reliable and valid and that participants are treated ethically. The chapter concludes with a discussion of the field and an examination of future trends, such as how data from user studies can inform the design of future accessibility guidelines and improve their efficacy.
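
To make concrete why automated conformance checks alone are insufficient, the sketch below (not part of the chapter; a minimal illustration assuming Python with BeautifulSoup) flags a few machine-detectable, WCAG-style issues such as images without alternative text. A check like this can confirm that an `alt` attribute exists, but only a study with representative users can establish whether the text is actually meaningful in a live context.

```python
# Minimal sketch of an automated conformance check (illustrative only).
# It flags a few machine-detectable barriers; it cannot judge whether the
# page is genuinely usable by people with disabilities in a live context.
from bs4 import BeautifulSoup


def find_basic_barriers(html: str) -> list[str]:
    soup = BeautifulSoup(html, "html.parser")
    issues = []

    # Images should carry a text alternative (WCAG 1.1.1); an automated
    # check cannot tell whether that text is meaningful to users.
    for img in soup.find_all("img"):
        if not img.get("alt"):
            issues.append(f"Image without alt text: {img.get('src', '<no src>')}")

    # Links need a discernible name for screen-reader users (WCAG 2.4.4).
    for link in soup.find_all("a"):
        if not link.get_text(strip=True) and not link.get("aria-label"):
            issues.append(f"Link without accessible name: {link.get('href', '<no href>')}")

    # Form inputs should be programmatically labelled (WCAG 1.3.1 / 3.3.2).
    labelled_ids = {lab.get("for") for lab in soup.find_all("label")}
    for field in soup.find_all("input"):
        if field.get("type") in ("hidden", "submit", "button"):
            continue
        if field.get("id") not in labelled_ids and not field.get("aria-label"):
            issues.append(f"Unlabelled input: {field.get('name', '<unnamed>')}")

    return issues


if __name__ == "__main__":
    sample = '<img src="chart.png"><a href="/next"></a><input type="text" name="email">'
    for issue in find_basic_barriers(sample):
        print(issue)
```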

Author information

Correspondence to Sukru Eraslan.

Copyright information

© 2019 Springer-Verlag London Ltd., part of Springer Nature

About this chapter

Cite this chapter

Eraslan, S., Bailey, C. (2019). End-User Evaluations. In: Yesilada, Y., Harper, S. (eds) Web Accessibility. Human–Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-4471-7440-0_11

  • DOI: https://doi.org/10.1007/978-1-4471-7440-0_11

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-7439-4

  • Online ISBN: 978-1-4471-7440-0
