An interpretive approach to evaluating information systems: A content, context, process framework

https://doi.org/10.1016/j.ejor.2005.07.006

Abstract

An evaluation framework is proposed that reflects the content, context, process (CCP) perspective developed from the existing IS literature. Evaluation is guided by addressing five questions: why is the evaluation being done? What is being evaluated? Who affects the evaluation? When does the evaluation take place? And how is it to be carried out? The framework addresses the identified need for more holistic processes for evaluating information systems and explains the role of interpretive methodologies in identifying the complex interplay of issues. It captures the social, political and cultural factors that influence the economic benefits of a system and emphasises the need for an integrated approach to evaluation.

Introduction

In the rich field of information systems (IS) evaluation literature there are many arguments advocating different methods and approaches for conducting effective evaluations. As systems have become more complex and interconnected, the need for evaluation processes that allow the true contribution of an IS to be recognised has increased. This reappraisal of IS evaluation has coincided with, and been influenced by, a paradigm shift in the way evaluation is perceived in other disciplines (Guba and Lincoln, 1989).

In IS, the adoption of a broader view has become still more important with the advent of e-commerce, which connects firms and customers in new ways and makes the system a competitive necessity rather than a competitive advantage. This paper’s contribution is a holistic evaluation framework that considers the content, context and process of evaluation, together with detailed factors that address what is being evaluated, why the evaluation is being done, who influences it, when it takes place and how it is to be carried out.

Section snippets

Information systems evaluation

Calls for interpretive approaches to IS evaluation that recognise information systems as both social and technical entities have increased since the late 1980s (Hirschheim and Smithson, 1988; Symons, 1991; Walsham, 1993). Hirschheim and Smithson (1988) argue that treating IS evaluation as a technical problem leads to meaningless conclusions that overlook the social activity inherent in the evaluation process and ignore the political–social environment of an

The CCP framework for evaluating information systems

A detailed examination of the literature enables identification of the factors that potentially influence the content, context and process of evaluation, and supports the construction of a framework for information systems evaluation. The overarching categorisation of relevant IS evaluation literature presented by Symons (1991) is used as a starting point and is extended to provide explanations of the significant constructs and alternatives for evaluation. The features of the

Methodological implications

Use of the CCP framework requires an approach that supports understanding of the nuances, influences and perceptions of those involved in the evaluation, and of the way they are in turn influenced by the organisational context. This justifies the use of an interpretive methodology that allows for sense-making in a complex situation, takes account of multiple interpretations and draws lessons from the evaluation process that can be used to improve future

Strengths and limitations of the framework

The CCP framework proposed in this paper makes a distinct contribution to the evaluation literature because it provides the detailed factors that need to be considered in an evaluation, yet remains sufficiently flexible to be of value across a wide range of projects. In this respect, the CCP framework provides a high-level structure for evaluation (content, context, process) with a detailed breakdown of sub-constructs or elements. Much of the evaluation literature provides perspectives on

Conclusions

If information systems are considered to be social as well as technical entities, and stakeholders are key to every stage of an IS, then traditional methods of IS evaluation based on technical measures alone are no longer sufficient.

The extended CCP framework offers a structure against which individual evaluation studies can be planned and carried out. The structure incorporates the elements that contribute to the rich, holistic studies advocated by current

References (63)

  • W.H. DeLone, Determinants of success for computer usage in small business, MIS Quarterly (1988)
  • W.H. DeLone et al., Information systems success: The quest for the dependent variable, Information Systems Research (1992)
  • W.H. DeLone et al., The DeLone and McLean model of information systems success: A ten-year update, Journal of Management Information Systems (2003)
  • W.J. Doll et al., A confirmatory factor analysis of the user information satisfaction instrument, Information Systems Research (1995)
  • B. Farbey et al., How to Evaluate Your IT Investment (1993)
  • B. Farbey et al., A taxonomy of information systems applications: The benefits’ evaluation ladder, European Journal of Information Systems (1995)
  • G.M. Giaglis et al., The ISSUE methodology for quantifying benefits from information systems, Logistics Information Management (1999)
  • A. Giddens, Central Problems in Social Theory: Action, Structure and Contradiction in Social Analysis (1979)
  • D.L. Goodhue, Development and measurement validity of a task-technology fit instrument for user evaluations of information systems, Decision Sciences (1998)
  • A.J. Gregory et al., Evaluation methodologies: A system for use, The Journal of the Operational Research Society (1992)
  • E.G. Guba et al., Fourth Generation Evaluation (1989)
  • R. Hirschheim et al., A critical analysis of information systems evaluation
  • E.R. House, Evaluating with Validity (1980)
  • E. Huerta et al., Evaluation of information technology: Strategies in Spanish firms, European Journal of Information Systems (1999)
  • Z. Irani et al., Information systems evaluation: Past, present and future, European Journal of Information Systems (2001)
  • Z. Irani et al., Developing a frame of reference for ex-ante IT/IS investment evaluation, European Journal of Information Systems (2002)
  • B. Ives et al., User involvement and MIS success: A review of research, Management Science (1984)
  • J. Jasperson et al., Power and information technology research: A meta-triangulation review, MIS Quarterly (2002)
  • S. Jones et al., Understanding IS evaluation as a complex social process: A case study of a UK local authority, European Journal of Information Systems (2001)
  • H.K. Klein et al., A set of principles for conducting and evaluating interpretive field studies in information systems, MIS Quarterly (1999)
  • F.F. Land, Evaluation in a socio-technical context