An interpretive approach to evaluating information systems: A content, context, process framework
Introduction
In the rich field of information systems (IS) evaluation literature there are many arguments advocating different methods and approaches for conducting effective evaluations. As systems have become more complex and interconnected, the need for evaluation processes that allow the true contribution of an IS to be recognised has increased. This reappraisal of IS evaluation has coincided with, and been influenced by, a paradigm shift in the way evaluation is perceived in other disciplines (Guba and Lincoln, 1989).
In IS, the adoption of a broader view has become yet more important with the advent of e-commerce, which connects firms and customers in new ways and makes the system a competitive necessity rather than a competitive advantage. This paper’s contribution is a holistic evaluation framework that considers the content, context and process of evaluation, together with detailed evaluation factors that address what is being evaluated, why the evaluation is being done, who influences the evaluation, when the evaluation takes place and how it is to be carried out.
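The five guiding questions named above (what, why, who, when, how) can be sketched as a simple planning checklist. This is a hypothetical illustration only: the paper prescribes no code artefact, and the grouping of particular questions under the content, context and process dimensions here is an assumption for the sake of the example.

```python
# Hypothetical sketch: the five CCP evaluation questions arranged as a
# planning checklist. The assignment of questions to dimensions is an
# illustrative assumption, not a structure taken from the paper itself.
CCP_QUESTIONS = {
    "content": ["What is being evaluated?"],
    "context": [
        "Why is the evaluation being undertaken?",
        "Who influences the evaluation?",
    ],
    "process": [
        "When does the evaluation take place?",
        "How is the evaluation to be carried out?",
    ],
}

def evaluation_plan(questions):
    """Flatten the framework into ordered (dimension, question) pairs."""
    return [(dim, q) for dim, qs in questions.items() for q in qs]

# An evaluator would work through each pair when scoping a study.
for dimension, question in evaluation_plan(CCP_QUESTIONS):
    print(f"{dimension}: {question}")
```

The flat list of pairs simply makes explicit that every dimension must be addressed before an evaluation is scoped; any richer structure (stakeholder lists, life-cycle stages) would be added per study.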
Section snippets
Information systems evaluation
The calls for interpretive approaches to IS evaluation that incorporate the recognition of information systems as both social and technical entities have increased since the late 1980s (Hirschheim and Smithson, 1988, Symons, 1991, Walsham, 1993). Hirschheim and Smithson (1988) argue that treating IS evaluation as a technical problem leads to meaningless conclusions that overlook the social activity inherent in the evaluation process and ignore the political–social environment of an
The CCP framework for evaluating information systems
A detailed examination of the literature has enabled the identification of the factors that potentially influence the content, context and process of evaluation, and underpins the construction of a framework to support information systems evaluation. The overarching categorisation of relevant IS evaluation literature presented by Symons (1991) is used as a starting point and is extended to provide explanations of the significant constructs and alternatives for evaluation. The features of the
Methodological implications
Use of the CCP framework requires an approach that supports understanding of the nuances, influences and perceptions of those involved in the evaluation, and the way they are in turn influenced by the context of the organisation. This provides justification for the use of an interpretive methodology that allows for the element of sense-making in a complex situation, taking into account multiple interpretations and drawing lessons from the evaluation process that can be used to improve future
Strengths and limitations of the framework
The CCP framework proposed in this paper makes a distinct contribution to the evaluation literature because it provides the detailed factors that need to be considered in evaluation, yet it remains sufficiently flexible to be of value in evaluating a wide range of projects. In this respect, the CCP framework provides a high-level structure for evaluation (content, context, process) with a detailed breakdown of sub-constructs or elements. Much of the evaluation literature provides perspectives on
Conclusions
If information systems are considered to be social as well as technical entities and stakeholders are key to every stage of IS, then the traditional methods of IS evaluation based on the use of technical measures are no longer sufficient.
The extended CCP framework offers a structure against which individual evaluation studies can be planned and carried out. The structure incorporates the elements that will contribute to the rich, holistic studies advocated by current
References (63)
- et al., Moving IS evaluation forward: Learning themes and research issues, Journal of Strategic Information Systems (1999)
- et al., User evaluations of IS as surrogates for objective performance, Information and Management (2000)
- et al., Information systems effectiveness: The construct space and patterns of application, Information and Management (1996)
- Information systems evaluation: Navigating through the problem domain, Information and Management (2002)
- The significance of context in information systems and organizational change, Information Systems Journal (2001)
- et al., Development of a tool for measuring and analyzing computer user satisfaction, Management Science (1983)
- et al., Information systems and other capital investments: Evaluation practices compared, Logistics Information Management (1999)
- et al., Assessing the value of Conoco’s EIS, MIS Quarterly (1993)
- et al., The ABCs of Evaluation (2000)
- Evaluation Research: An Introduction to Principles, Methods and Practice (1999)
- Determinants of success for computer usage in small business, MIS Quarterly
- Information systems success: The quest for the dependent variable, Information Systems Research
- The DeLone and McLean model of information systems success: A ten-year update, Journal of Management Information Systems
- A confirmatory factor analysis of the user information satisfaction instrument, Information Systems Research
- How to Evaluate Your IT Investment
- A taxonomy of information systems applications: The benefits’ evaluation ladder, European Journal of Information Systems
- The ISSUE methodology for quantifying benefits from information systems, Logistics Information Management
- Central Problems in Social Theory: Action, Structure and Contradiction in Social Analysis
- Development and measurement validity of a task-technology fit instrument for user evaluations of information systems, Decision Sciences
- Evaluation methodologies: A system for use, The Journal of the Operational Research Society
- Fourth Generation Evaluation
- A critical analysis of information systems evaluation
- Evaluating with Validity
- Evaluation of information technology: Strategies in Spanish firms, European Journal of Information Systems
- Information systems evaluation: Past, present and future, European Journal of Information Systems
- Developing a frame of reference for ex-ante IT/IS investment evaluation, European Journal of Information Systems
- User involvement and MIS success: A review of research, Management Science
- Power and information technology research: A metatriangulation review, MIS Quarterly
- Understanding IS evaluation as a complex social process: A case study of a UK local authority, European Journal of Information Systems
- A set of principles for conducting and evaluating interpretive field studies in information systems, MIS Quarterly
- Evaluation in a socio-technical context