Towards a multi-paradigmatic, value-free informetrics: A reply to Paul Wouters’ book review “The failure of a paradigm”

https://doi.org/10.1016/j.joi.2018.03.003

Section snippets

Preamble

In this contribution I comment on the most important statements in Paul Wouters’ review “The failure of a paradigm” (Wouters, 2018) of my book “Applied Evaluative Informetrics” (Moed, 2017). These statements often seem to rest on a mismatch between the book’s intentions and Wouters’ expectations, but some of them, fortunately, properly mark essential differences between his thinking and my own. I focus on the three key aspects expressed in the book’s title, namely that its …

The book’s intended audience

The book is directed towards a broad academic audience, not merely the research community in the fields of bibliometrics, informetrics, quantitative science studies, or social studies of science. I have deliberately chosen to provide breadth and background, and to leave out technical details. This is not so much a matter of taste or style as a consequence of the conviction that it is a matter of social responsibility that specialists in the field seek to explain to the “outside world” in an …

The informetric dimension

The book focuses on the role of informetrics in research assessment. The use of the term informetrics rather than bibliometrics in the book’s title marks an important development, not only in the field of quantitative science studies but in virtually all domains of science and scholarship, and in society at large: the computerization or digitalization of information and communication. The book aims to adopt a broad perspective on this development as well. It not only shows how new …

The book’s applied character

The book deals with the application of informetric techniques in research assessment. If it deserves the qualification scientific, it is applied science, but one that is fully aware of the need for a theoretical foundation of assessment methods and practices. The book does not offer a comprehensive research agenda for the field of social studies of science. Nor does it aim to continue the technical debates on informetric issues taking place in the specialist journals of our field.

It aims to create …

Facts and values

For many years I have been actively involved in debates about bibliometric indicators, during which I increasingly realized that, in seemingly technical debates, political or policy considerations influenced the functional form of indicators. I included an entire chapter (Chapter 7) illustrating this through a series of concrete examples, and I invite all colleagues in the field – and of course all interested outsiders – to read it. It also shows how indicator concepts mirror the …
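One well-known illustration of this point from the field’s literature (it is my own example, not one taken from the book’s Chapter 7, and the numbers below are invented) is the debate over whether a field-normalized citation indicator should be computed as a “ratio of averages” or as an “average of ratios”. A minimal Python sketch of the two functional forms applied to the same data:

    # Two functional forms of a field-normalized citation indicator,
    # computed over the same set of papers. Which form a "crown
    # indicator" should take was a genuine debate in the field; the
    # choice is not value-neutral, because each form weights papers
    # (and hence research groups) differently.

    papers = [
        # (citations received, expected citations for the paper's field and year)
        (10, 2.0),   # highly cited paper in a low-citation field
        (1, 4.0),    # modestly cited paper in a high-citation field
        (3, 3.0),
    ]

    def ratio_of_averages(papers):
        """Total citations divided by total expected citations."""
        total_cites = sum(c for c, _ in papers)
        total_expected = sum(e for _, e in papers)
        return total_cites / total_expected

    def average_of_ratios(papers):
        """Mean of each paper's own citations/expected ratio."""
        return sum(c / e for c, e in papers) / len(papers)

    print(f"ratio of averages: {ratio_of_averages(papers):.2f}")   # 14/9 = 1.56
    print(f"average of ratios: {average_of_ratios(papers):.2f}")   # (5 + 0.25 + 1)/3 = 2.08

The average of ratios gives every paper equal weight, whereas the ratio of averages effectively weights papers by their expected citation rate. Deciding which weighting is fairer to, say, small groups or low-citation fields is a policy judgement rather than a purely technical one.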

Evaluation science

The book expresses the multi-disciplinary and multi-paradigmatic nature of evaluation science, which calls for the development of an inter-disciplinary view – a task that cannot easily be achieved with practitioners who assign an absolute or preferred status to one particular approach or paradigm. The book definitely aims to raise the discussion beyond the framework of a single paradigm. Its message is: let us look more broadly than that; there are so many issues in business studies, educational …

Evaluative frameworks

The notion of an evaluative framework plays an important role in my book. It refers to a specification of the qualitative principles that guide a concrete assessment process. A core element of an evaluative framework is the specification of a performance criterion, expressed in a set of propositions on what constitutes research quality or performance. In short, these propositions state what is valuable. As argued above, the book defends the position that such values cannot be grounded in …

About constitutive effects

In his concluding paragraph Wouters claims that my book “tries to ignore the formative effects of evaluations in general and informetric tools in particular on the character of the process of knowledge creation by narrowly focusing on ‘performance enhancement’ and ignoring the social effects (not at all necessarily detrimental by the way) on the research system and on individual research groups”. Fortunately, Wouters gives several quotes showing that the book does not ignore these effects at …

The way forward

In a chapter entitled “The Way Forward in Quantitative Research Assessment”, the book questions what it identifies as a basic assumption underlying the application of bibliometric indicators in the 1980s, namely that “it is not the potential influence but the actual influence, not the importance but the impact that is most closely linked to the notion of scientific-scholarly progress, and that in an actual research assessment, it is not the importance but the impact that is of primary interest to policy …

Concluding remarks

I wish to thank Paul Wouters for the effort he put into writing his book review. Perhaps the most intriguing element of Wouters’ review is its title: The failure of a paradigm! It is not clear to me which paradigm he is referring to. Is it perhaps the critical rationalist school of thought, which provides an intellectual basis for the thesis that evaluative informetrics – and also the social construction of evaluation – do not themselves evaluate? In this case it is clear to me that Paul Wouters is a strong …

References (10)

  • H. Albert, Wertfreiheit als methodisches Prinzip [Value freedom as a methodological principle].
  • “Critical Rationalism”, n.d. ...
  • P. Dahler-Larsen, Constitutive effects of performance indicators: Getting beyond unintended consequences, Public Management Review (2013).
  • W. Glänzel et al., Springer Handbook of Science and Technology Indicators (2018).
  • Hans Albert, n.d. ...
There are more references available in the full text version of this article.
