A diversity-sensitive evaluation method

  • Long Paper
  • Universal Access in the Information Society

Abstract

This paper presents an evaluation method, along with its underlying theory, for assessing interactive systems and specifying their quality in terms of universal access. The method is an adaptation of traditional walkthroughs and aims to incorporate user diversity, for example in terms of individual abilities, skills, background, levels of expertise, equipment used, etc., as key input to evaluation. It seeks to address as many as possible of the qualities of a system that might affect diverse users throughout their usage of the system and that, ultimately, have an impact on the system’s wide acceptance. The proposed method extends the cognitive walkthrough method by introducing a simulation of the users’ reasoned action process, in order to assess whether users can, and will be willing to, access, explore, utilise and, ultimately, adopt a system. Additionally, the method allows various aspects of diversity among target users and conditions of use to be considered in the assessment, rather than evaluating for the so-called average user, thereby incorporating accessibility, usability and acceptance as intrinsic measurements. Finally, the paper presents ORIENT, a prototype inspection tool developed to further support experts in conducting such walkthroughs in practice, offering step-by-step guidance throughout the process up to final reporting. Preliminary experiences with the application of the method in the domain of e-Services are also discussed.


Notes

  1. e-Accessibility means ensuring that ICT products and services are usable by as many people as possible, and in particular by people with special needs due to disabilities [9].

  2. e-Inclusion (‘e’ standing for electronic) means ensuring that digital technologies are both open to everyone, without barriers, and are used to overcome social and economic exclusion [9].

  3. “It is not the utility, but the usability of a thing which is in question” [7].

  4. Forming the goal; forming the intention; specifying an action; executing the action; perceiving the state of the world; interpreting the state of the world; and evaluating the outcome.

  5. The term “system” is used to refer to various types of interactive artifacts including services, software or hardware products, user interface components and their underlying functionality or any combination of these.

  6. The term “individual” user refers to individual conditions of use (user characteristics, context of use and behavioural situations).

  7. The use of the term accessibility (in its literal sense) is avoided here in order to ensure that it is not confused with ease of access for people with disabilities.

  8. The factor “competitiveness” has not been implemented in the ORIENT tool described in later sections.

  9. Publish: the user simply accesses (searches and retrieves) information; there is no other communication between the user and the system.

  10. Interact: the user accesses dynamic information but does not act upon it (i.e., cannot modify the system’s data).

  11. Transact: the user accesses dynamic information and has rights to make changes to it.

  12. Collaboration: can be asynchronous (e.g., track changes facilities, shared documents area) or synchronous (collaborative virtual environments).

  13. Human–human communication: can be asynchronous (emails, message boards, annotations, etc.) or synchronous (chat, webcams, etc.).

  14. Social interaction and navigation: can be real or virtual (i.e., through the system).

  15. That is, between (a) the ways employed to promote the system and to provide potential/target users with information about its existence and utility, and (b) the sources of information actually used by the users.

  16. As averred by the dissemination materials of the system and implied by its physical platform.

  17. Exploration in width.

  18. Occasional use.

  19. Exploitation in depth.

References

  1. Antona, M., Mourouzis, A., Kartakis, G., Stephanidis, C.: User requirements and usage life-cycle for digital libraries. In: Jacko, J. Kathlene, V. Leonard (eds.) Emergent application domains in HCI—Volume 5 of the Proceedings of the 11th International Conference on Human–Computer Interaction (HCI International 2005), Las Vegas, Nevada, USA, 22–27 July. Mahwah, New Jersey: Lawrence Erlbaum Associates. [CD-ROM] (2005)

  2. Antona, M., Mourouzis, A., Kastrinaki, A., Boutsakis, E., Stephanidis, C.: User-orientation inspection of ten European e-Services: results and lessons learned. FORTH-ICS Technical Report TR-373 (2006)

  3. Benyon, D., Crerar, A., Wilkinson, S.: Individual differences and inclusive design. In: Stephanidis, C. (ed.) User Interfaces for All—Concepts, Methods, and Tools, pp. 21–46. Lawrence Erlbaum Associates, Mahwah, NJ (ISBN 0-8058-2967-9) (2001)

  4. Brajnik, G.: Beyond conformance: the role of accessibility evaluation methods. In: Hartmann, S. et al. (eds.) WISE 2008, LNCS 5176, pp. 63–80, (2008)

  5. Coyne, K., Nielsen, J.: How to conduct usability evaluations for accessibility. Nielsen Norman Group Report, Fremont, CA, USA (2004)

  6. Davis, F.: Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 13(3), 319–340 (1989)

  7. De Quincey, T.: Ricardo and Adam Smith (Part III). Blackwoods Mag. 52, 718–739 (1842)

  8. Day, H., Jutai, J., Woolrich, W., Strong, G.: The stability of impact of assistive devices. Disabil. Rehabil. 23(9), 400–404 (2001)

  9. European Commission: BEING PART OF IT: European research for an inclusive information society. Office for Official Publications of the European Communities, Luxembourg. ISBN 978-92-79-08587-1 (2008)

  10. eUSER.: eUSER conceptual and analytical framework (first version). In: Cullen, K. (ed.) eUSER Deliverable D1.1, Part A (2004)

  11. Featherman, M.S., Pavlou, P.A.: Predicting e-Services adoption: a perceived risk facets perspective. Int. J. Hum. Comput. Stud. 59(4), 451–474 (2003)

  12. Fishbein, M., Ajzen, I.: Belief, Attitude, Intention and Behavior: An Introduction to Theory and Research. Addison-Wesley, Reading, MA (1975)

  13. Garvin, D.A.: What Does “Product Quality” Really Mean? The Sloan Management Review (Autumn): 25–43 (1984)

  14. Gefen, D., Straub, D.: The relative importance of perceived ease-of-use in IS adoption: a study of e-Commerce adoption. J. Assoc. Inf. Syst. 1(8), 1–20 (2000)

  15. Grudin, J.: Utility and usability: research issues and development contexts. Interact. Comput. 4(2), 209–217 (1992)

  16. Henry, S.L., Law, C., Barnicle, K.: Adapting the design process to address more customers in more situations. Tutorial in UPA 2001 Conference, June 25–29, Lake Las Vegas, Nevada (2001)

  17. Henwood, F., Wyatt, S., Miller, N., Senker, P.: Critical perspectives on technologies, in/equalities and the information society. In: Wyatt, S., Henwood, F., Miller, N., Senker, P. (eds.) Technology and In/equality: Questioning the Information Society, 1–18. Routledge, London (2000)

  18. Hofstede, G.: Cultures and Organizations: Software of the Mind. McGraw-Hill, New York (1997)

  19. Holzinger, A.: Usability engineering for software developers. Commun. ACM 48(1), 71–74 (2005)

  20. Holzinger, A., Schaupp, K., Eder-Halbedl, W.: An investigation on acceptance of ubiquitous devices for the elderly in a geriatric hospital environment: using the example of person tracking. In: 11th International Conference on Computers Helping People with Special Needs, Lecture Notes in Computer Science (LNCS 5105), pp. 22–29. Springer (2008)

  21. ISO 9241.: ISO DIS 9241—Part 11: guidance on usability (1998)

  22. ISO 9001.: Quality management systems (2000)

  23. Kawakita, J.: The KJ method: a scientific approach to problem solving. Technical report, Kawakita Research Institute, Tokyo (1975)

  24. Khaslavsky, J.: Integrating culture into interface design. In: Proceedings of CHI 1998, pp. 365–366 (1998)

  25. Kirakowski, J.: Questionnaires in Usability Engineering—A List of Frequently Asked Questions (3rd edn). Human Factors Research Group. Cork, Ireland. Retrieved September 11, 2003, from http://www.ucc.ie/hfrg/resources/qfaq1.html (2000)

  26. Koubek, R.J., Salvendy, G.: A conceptual model of human skill requirements for advanced manufacturing settings. In: Proceedings of the fifth international conference on human-computer interaction 1993. pp. 356–361 (1993)

  27. Law, C., Barnicle, K., Henry S.L.: Usability screening techniques: evaluating for a wider range of environments, circumstances and abilities. In: Proceedings of UPA2000 conference (Usability Professionals’ Association annual conference) (2000)

  28. Law, C., Vanderheiden, M.: Tests for screening product designs prior to user testing by people with functional limitations. HFES’99 (Human Factors & Ergonomics Society annual meeting), Houston, TX, September 27–October 1 (1999)

  29. Lee, Y., Kozar, K.A., Larsen, K.R.T.: The technology acceptance model: past, present and future. Commun. Assoc. Inf. Syst. 12(50), 752–780 (2003)

  30. Lepistö, A., Ovaska, S.: Usability evaluation involving participants with cognitive disabilities. In: Proceedings of the third Nordic conference on human-computer interaction (Tampere, Finland, October 23–27, 2004). NordiCHI ‘04, vol. 82, pp. 305–308. ACM, New York, NY (2004)

  31. Mack, R.L., Nielsen, J.: Executive summary. In: Nielsen, J., Mack, R.L. (eds.) Usability Inspection Methods, pp. 1–23. Wiley, New York, NY (1994)

  32. MAUSE.: Maturation of usability evaluation methods: retrospect and prospect. In: Law, E.L.C., Scapin, D., Cockton, G., Springett, M., Stary, C., Winckler, M. (eds.) Final Reports of COST294—Working Groups. IRIT Press, Toulouse, France, p. 188. ISBN:978-2-917490-06-8 (2009)

  33. Molich, R., Kindlund, E.: Improving your skills in usability testing. CHI2000 Tutorial (2000)

  34. Moore, G.C., Benbasat, I.: Development of an instrument to measure the perceptions of adopting an information technology innovation. Inf. Syst. Res. 2(3), 192–222 (1991)

  35. MORI.: Measuring & understanding customer satisfaction. A MORI Review for the Office of Public Services Reform. London: The Prime Minister’s Office of Public Services Reform (2002)

  36. Morris, M.G., Dillon, A.: The influence of user perceptions on software utilization: application and evaluation of a theoretical model of technology acceptance. IEEE Softw. 14(4), 6–58 (1997)

  37. Mourouzis, A., Antona, M., Boutsakis, E., Stephanidis, C.: An evaluation framework incorporating user interface accessibility. In: Stephanidis, C. (ed.) Universal Access in HCI: Exploring New Dimensions of Diversity—Volume 8 of the Proceedings of the 11th international conference on human-computer interaction (HCI International 2005), Las Vegas, Nevada, USA, 22–27 July. Mahwah, New Jersey: Lawrence Erlbaum Associates [CD-ROM] (2005)

  38. Mourouzis, A., Antona, M., Boutsakis, E., Kastrinaki, A., Stephanidis, C.: User-orientation evaluation framework for e-Services: inspection tool and usage guidelines. FORTH-ICS Technical Report, TR-372 (2006a)

  39. Mourouzis, A., Antona, M., Boutsakis, E., Stephanidis, C.: A user-orientation evaluation framework: assessing accessibility throughout the user experience lifecycle. In: Miesenberger, K., et al. (eds.) ICCHP 2006, LNCS 4061, pp. 421–428 (2006b)

  40. Nielsen, J.: Usability Engineering. Academic Press Limited (1993)

  41. Norman, D.A.: Cognitive engineering. In: Norman, D.A., Draper, S.W. (eds.) User Centered System Design: New Perspectives in Human-Computer Interaction, pp. 31–61. Lawrence Erlbaum Assoc, Hillsdale, NJ (1986)

  42. Norman, D.A.: The Psychology of Everyday Things. Basic Books, New York (1988)

  43. Paciello, M.: Assessing usability for people with disabilities through remote evaluation and critical incident reporting. Available at: http://www.paciellogroup.com/whitepapers/WPAssessingUsability.html (2002)

  44. Pavlou, P.A.: Consumer intentions to adopt electronic commerce—incorporating trust and risk in the technology acceptance model. Paper presented at the 2001 Diffusion Interest Group in Information Technology Workshop (2001)

  45. Petrie, H., Hamilton, F., King, N., Pavan, P.: Remote usability evaluations with disabled people. In: CHI 2006: Proceedings of the SIGCHI conference on Human factors in computing systems, ACM, New York, pp. 1133–1141 (2006)

  46. Polson, P., Lewis, C., Rieman, J., Wharton, C.: Cognitive walkthroughs: a method for theory-based evaluation of user interfaces. Int. J. Man Mach. Stud. 36, 741–773 (1992)

  47. Polson, P.G., Lewis, C.H.: Theory-based design for easily learned interfaces. Hum. Comput. Interact. 5, 191–220 (1990)

  48. Rogers, E.: Diffusion of Innovations. The Free Press, New York (1993)

  49. Salminen, A.-L., Petrie, H.: Evaluating assistive technology prototypes: laboratory or real life contexts? In: Proceedings of TIDE 1998 Conference, July 1998, Helsinki (1998)

  50. Savidis, A., Stephanidis, C.: Unified user interface design: designing universally accessible interactions. Interact. Comput. 16(2), 243–270 (2004)

  51. Scherer, M.J., Sax, C.: Measures of assistive technology predisposition and use. In: Mpofu, E., Oakland, T. (eds.) Assessment in Rehabilitation and Health. Allyn & Bacon, Boston, ISBN 0-205-50174-5 (2009)

  52. Scherer, M.J., Sax, C., Vanbeirvliet, A., Cushman, L.A., Scherer, J.V.: Predictors of assistive technology use: the importance of personal and psychosocial factors. Disabil. Rehabil. 27(21), 1321–1331 (2005)

  53. Shneiderman, B.: Universal usability: pushing human–computer interaction research to empower every citizen. Commun. ACM 43(5), 85–91 (2000)

  54. Stephanidis, C. (ed.): User Interfaces for All—Concepts, Methods, and Tools. Lawrence Erlbaum Associates, Mahwah, NJ (ISBN 0-8058-2967-9, p. 760) (2001)

  55. Stephanidis, C., Salvendy, G., Akoumianakis, D., Bevan, N., Brewer, J., Emiliani, P.L., Galetsas, A., Haataja, S., Iakovidis, I., Jacko, J., Jenkins, P., Karshmer, A., Korn, P., Marcus, A., Murphy, H., Stary, C., Vanderheiden, G., Weber, G., Ziegler, J.: Toward an information society for all: an international R&D agenda. Int. J. Hum. Comput. Interact. 10(2), 107–134 (1998)

  56. Tornatzky, L.G., Klein, K.J.: Innovation characteristics and innovation adoption-implementation: a meta-analysis of findings. IEEE Trans. Eng. Manage. 29(1), 28–45 (1982)

  57. Venkatesh, V., Bala, H.: Technology acceptance model 3 and a research agenda on interventions. Decis. Sci. 39(2), 273–315 (2008)

  58. Vigo, M., Kobsa, A., Arrue, M., Abascal, J.: User-tailored web accessibility evaluations. In: HyperText 2007, Manchester, UK, September 2007, ACM, New York. pp. 95–104 (2007)

  59. Warschauer, M.: Technology and Social Inclusion: Rethinking the Digital Divide. The MIT Press, Cambridge, MA, USA (2003)

  60. Zeithaml, V.A., Parasuraman, A., Malhotra, A.: A conceptual framework for understanding e-Service quality: implications for future research and managerial practice. MSI Monograph, Report #00-115 (2001)

Acknowledgments

Part of this work has been carried out in the framework of the European Commission funded project eUSER (“Evidence-based support for the design and delivery of user-centred on-line public services”, Contract no. 507180).

Author information

Corresponding author

Correspondence to Constantine Stephanidis.

Additional information

Alexandros Mourouzis is currently affiliated with the Centre for Research and Technology Hellas (CERTH), Thessaloniki, Greece. The work reported in this paper was conducted while he was affiliated with the Institute of Computer Science of FORTH.

About this article

Cite this article

Mourouzis, A., Antona, M. & Stephanidis, C. A diversity-sensitive evaluation method. Univ Access Inf Soc 10, 337–356 (2011). https://doi.org/10.1007/s10209-010-0211-y
