
Overturning some assumptions about the effects of evaluation systems on publication performance


Abstract

In 1989 the Spanish Government established an individual retrospective research evaluation system (RES) for public researchers. Policy makers have associated the establishment of this evaluation system with the significant increase in the volume of scientific publications attributed to Spain over the past decades. In a similar vein to analyses of other country cases, some scholars have also claimed that the growth of Spain’s international scientific publications is a result of the establishment of the new evaluation system. In this paper, we provide a methodological review of the validity threats in previous research, and we use interrupted time-series analyses and control groups to investigate the effects of this policy instrument on the number of papers produced by Spanish authors. In the years following the establishment of the evaluation system, the results indicate a considerable increase in the number of papers attributed to Spanish authors among those eligible for evaluation (the “treated” group), but also in the control groups. After testing various alternative explanations, we conclude that the growth in Spanish publications cannot be attributed indisputably to the establishment of the RES, but rather to the increase in expenditure and in the number of researchers in the Spanish R&D system, along with some maturation effects. We take this case as an example of the need to improve and refine methodologies and to be more cautious when attributing effects to research evaluation mechanisms at the national level.
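For readers unfamiliar with the design, the following is a minimal sketch of a segmented (interrupted) time-series regression of the kind referred to above. It is an illustration only, not the authors’ actual specification: the synthetic data, variable names and model form are assumptions for demonstration; only the 1989 intervention year comes from the paper.

```python
# Hedged sketch of a segmented (interrupted) time-series regression.
# Synthetic data for illustration -- NOT the paper's actual series or model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
years = np.arange(1974, 2006)
intervention = 1989  # year the Spanish RES (sexenio system) was established

df = pd.DataFrame({"year": years})
df["t"] = df["year"] - df["year"].min()                # overall time trend
df["post"] = (df["year"] >= intervention).astype(int)  # 1 after the RES
df["t_post"] = df["t"] * df["post"]                    # post-intervention trend
# Synthetic publication counts: a pre-existing growth trend plus noise.
df["papers"] = 2000 + 300 * df["t"] + rng.normal(0, 200, len(df))

# papers = b0 + b1*t + b2*post + b3*t_post
# b2 estimates an immediate level shift at the intervention;
# b3 estimates a change in the growth rate after it.
fit = smf.ols("papers ~ t + post + t_post", data=df).fit()
print(fit.params)
```

A control series (for instance, the output of authors not eligible for evaluation) would enter as a second group with interaction terms; that treated-versus-control comparison is the one the paper exploits.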


Notes

  1. For example, Juan Rojo, the Vice-Minister of Universities and Research, stated in an interview in a Spanish newspaper that the system was an appropriate incentive for rewarding the most productive researchers (La Vanguardia, 24 March 1990). http://hemeroteca.lavanguardia.es/preview/1990/03/24/pagina-15/33007225/pdf.html. Accessed 10 May 2010.

  2. More information at http://www.educacion.es/horizontales/ministerio/organismos/cneai.html. Accessed 10 May 2010.

  3. The main contributions considered as such were journal papers, books and patents.

  4. From 1989 to 2005, 38,872 tenured university professors and 2,434 CSIC researchers were voluntarily evaluated. Of each group, respectively, 47 and 75% had all their sexenios positively evaluated; 25 and 21% had some sexenios positively evaluated; and 28 and 4% had no sexenios, either because they did not apply or because their applications were not approved. More information is available at http://www.educacion.es/horizontales/ministerio/organismos/cneai/memorias-informes.html; accessed 3 May 2010. There is also some evidence that the number of positive evaluations, at the institutional level, correlates with the volume of ISI publications (Scimago 2006).

  5. In this paper, we adopt an aggregated level of analysis, although we acknowledge that there might be differences among scientific fields that could hide effects in some areas. There is evidence that the average number of ISI publications associated with the granting of a sexenio varies widely across areas (Scimago 2007).

  6. Despite the limitations of this database for non-English-speaking countries (Moed et al. 1995), it seems the most suitable one for the period selected. We use the number of documents, including all items, not just citable documents. A Spanish paper was defined as a paper whose list of authors’ corporate addresses contains at least one Spanish institution; thus, we use whole counting, not fractional counting (a toy illustration of this counting choice follows these notes).

  7. Additionally, the growth could be the result of instrumentation effects (e.g. the overall growth of papers in the databases, or the increase in the number of journals included in them, more specifically Spanish journals), which we also discuss in the next section. The Spanish growth was not attributable to changes in the coverage of Spanish journals in the ISI databases (Gómez et al. 1995). The coverage of journals from Spanish publishers in ISI even declined, down to 29 in SCI, 2 in SSCI and 15 in A&HCI by the early 2000s (Gómez et al. 2006).

  8. International collaboration involves authors from at least two different countries; under whole counting, the output is then attributed in full to every country involved.

  9. Reports prepared by different international consulting companies were made public [e.g. SRI International (1988)]; the increasing diffusion of country-level bibliometric analyses was also important (Braun et al. 1985).

  10. We have analysed whether the number of papers published in Spanish included in the databases could be the “cause of growth”: in 1980, 41.6% of papers from Spain (1,670) were published in Spanish, while the share was 17.6% (1,976) in 1990 and 7.25% (1,916) in 2005. Zitt et al. (1998) clearly present Spain as a country that has made the “transition” to publication in English in the SCI-ISI.

  11. To make clear that this was not an instrumentation effect, we note that Spain accounted for 0.62% of the world share of publications in 1979, 0.96% in 1984, 1.47% in 1989 and 2.09% in 1994 (Van Raan 1997).

  12. See Fig. 3 in Jiménez-Contreras et al. (2003).

  13. We should bear in mind that, at the beginning of the 1980s, researchers’ labour costs represented around 70% of R&D expenditure.

  14. Recently, Rodriguez-Navarro (2009) has argued that the Spanish evaluation system does not promote high-quality or outstanding results, despite the fact that Spain’s aggregate average impact factor has improved in recent years, reaching the world average (Gómez et al. 2006).
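
Notes 6 and 8 describe the whole-counting convention used to attribute papers to countries; the toy illustration promised in note 6 follows. The papers and counts below are hypothetical, meant only to contrast whole counting with the fractional alternative the authors did not use.

```python
# Toy contrast of whole vs. fractional counting by country.
# Hypothetical papers -- not the paper's actual data pipeline.
from collections import defaultdict

# Each paper is represented by the set of countries in its author addresses.
papers = [
    {"ES"},               # purely Spanish paper
    {"ES", "FR"},         # bilateral collaboration
    {"ES", "FR", "US"},   # trilateral collaboration
]

whole = defaultdict(float)
fractional = defaultdict(float)
for countries in papers:
    for c in countries:
        whole[c] += 1                        # full credit to every country
        fractional[c] += 1 / len(countries)  # credit split among countries

print(dict(whole))       # whole counting:      ES=3.0, FR=2.0, US=1.0
print(dict(fractional))  # fractional counting: ES≈1.83, FR≈0.83, US≈0.33
```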

References

  • Andersen, L. B., & Pallesen, T. (2008). “Not Just for the Money?” How financial incentives affect the number of publications at Danish research institutions. International Public Management Journal, 11(1), 28–47.

  • Braun, T., & Schubert, A. (1988). World flash on basic research: Scientometric versus socio-economic indicators. Scatter plots for 51 countries, 1978–1980. Scientometrics, 13(1–2), 1–9.

  • Braun, T., Glänzel, W., & Schubert, A. (1985). Scientometric indicators: A 32-country comparative evaluation of publishing performance and citation impact. Singapore, Philadelphia: World Scientific.

  • Butler, L. (2003a). Explaining Australia’s increased share of ISI publications: The effects of a funding formula based on publication counts. Research Policy, 32(1), 143–155.

  • Butler, L. (2003b). Modifying publication practices in response to funding formulas. Research Evaluation, 12(1), 39–46.

  • Campbell, D. F. J. (2003). The evaluation of university research in the United Kingdom and the Netherlands, Germany and Austria. In Ph. Shapira & S. Kuhlmann (Eds.), Learning from science and technology policy evaluation (pp. 98–131). Cheltenham (UK): Edward Elgar.

  • Campbell, D. T. (1996). Regression artifacts in time-series and longitudinal data. Evaluation and Program Planning, 19(4), 377–389.

  • Campbell, D. T., & Ross, H. L. (1968). The Connecticut crackdown on speeding: Time-series data in quasi-experimental analysis. Law and Society Review, 3(1), 32–54.

  • Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Boston: Houghton Mifflin Co.

  • Crespi, G. A., & Geuna, A. (2008). An empirical study of scientific production: A cross-country analysis (1981–2002). Research Policy, 37(4), 565–579.

  • Cruz-Castro, L., & Sanz-Menéndez, L. (2007). Research evaluation in transition: Individual versus organisational assessment in Spain. In R. Whitley & J. Gläser (Eds.), The changing governance of the sciences: The advent of research evaluation systems (pp. 205–224). Dordrecht (NL): Springer.

  • Cruz-Castro, L., Sanz-Menéndez, L., & Martínez, C. (2010). Research centers in transition: Patterns of convergence and diversity. The Journal of Technology Transfer. doi:10.1007/s10961-010-9168-5 (forthcoming).

  • Georghiou, L., Howells, J., Rigby, J., Glynn, S., Butler, J., Cameron, H., et al. (2000). Impact of the research assessment exercise and the future of quality assurance in the light of changes in the research landscape. A report produced by PREST, University of Manchester, for HEFCE (Higher Education Funding Council for England), April 2000. http://www.mbs.ac.uk/research/innovation/publications-archive/reports.aspx. Accessed 10 May 2010.

  • Geuna, A., & Martin, B. (2003). University research evaluation and funding: An international comparison. Minerva, 41(4), 277–304.

  • Gläser, J. (2007). The social orders of research evaluation systems. In R. Whitley & J. Gläser (Eds.), The changing governance of the sciences: The advent of research evaluation systems (pp. 245–266). Dordrecht (NL): Springer.

  • Gómez, I., Fernández, M. T., Zulueta, M. A., & Camí, J. (1995). Analysis of biomedical research in Spain. Research Policy, 24(3), 459–471.

  • Gómez, I., Sancho, R., Bordons, M., & Fernández, M. T. (2006). La I+D en España a través de publicaciones y patentes. In J. Sebastián & E. Muñoz (Eds.), Radiografía de la investigación pública en España (pp. 275–302). Madrid: Biblioteca Nueva.

  • Hicks, D. (2009). Evolving regimes of multi-university research evaluation. Higher Education, 57(4), 393–404.

  • Ingwersen, P., & Jacobs, D. (2004). South African research in selected scientific areas: Status 1981–2000. Scientometrics, 59(3), 405–423.

  • Jiménez-Contreras, E., & Ferreiro-Aláez, L. (1996). Publishing abroad: Fair trade or short sell for non-English-speaking authors: A Spanish study. Scientometrics, 36(1), 81–95.

  • Jiménez-Contreras, E., Moya-Anegón, F., & López-Cózar, E. (2003). The evolution of research activity in Spain: The impact of the National Commission for the Evaluation of Research Activity (CNEAI). Research Policy, 32(1), 123–142.

  • Laudel, G. (2006). The art of getting funding: How scientists adapt to their funding conditions. Science and Public Policy, 33(7), 489–504.

  • Liefner, I. (2003). Funding, resource allocation, and performance in higher education systems. Higher Education, 46(4), 469–489.

  • Méndez, A., & Gómez, I. (1986). The Spanish scientific productivity through eight international databases. Scientometrics, 10(3–4), 207–219.

  • Moed, H. (2008). UK research assessment exercises: Informed judgements on research quality or quantity? Scientometrics, 74(1), 153–161.

  • Moed, H. F., De Bruin, R. E., & Van Leeuwen, T. N. (1995). New bibliometric tools for the assessment of national research performance: Database description, overview of indicators and first applications. Scientometrics, 33(3), 381–422.

  • Moed, H. F., Van Leeuwen, T. N., & Visser, M. S. (1999). Trends in publication output and impact of universities in the Netherlands. Research Evaluation, 8(1), 60–67.

  • Mohr, L. B. (2000). Regression artifacts and other customs of dubious desert. Evaluation and Program Planning, 23(4), 397–409.

  • Moya-Anegón, F., & Herrero-Solana, V. (1999). Science in América Latina: A comparison of bibliometric and scientific-technical indicators. Scientometrics, 46(2), 299–320.

  • Moya-Anegón, F., et al. (2007). Indicadores bibliométricos de la actividad científica española (1990–2004). Madrid: FECYT.

  • Önder, C., Sevkli, M., Altinok, T., & Tavukçuoglu, C. (2008). Institutional change and scientific research: A preliminary bibliometric analysis of institutional influences on Turkey’s recent social science publications. Scientometrics, 76(3), 543–560.

  • Persson, O., Glänzel, W., & Danell, R. (2004). Inflationary bibliometric values: The role of scientific collaboration and the need for relative indicators in evaluative studies. Scientometrics, 60(3), 421–432.

  • Rodriguez-Navarro, A. (2009). Sound research, unimportant discoveries: Research, universities, and formal evaluation of research in Spain. Journal of the American Society for Information Science and Technology, 60(9), 1845–1858.

  • Sanz-Menéndez, L. (1995a). Policy choices, institutional constraints and policy learning: The Spanish science and technology policy in the eighties. International Journal of Technology Management, 10(4/5/6), 255–274.

  • Sanz-Menéndez, L. (1995b). Research actors and the state: Research evaluation and evaluation of science and technology policies in Spain. Research Evaluation, 5(1), 79–88.

  • Sanz-Menéndez, L. (1997). Estado, ciencia y tecnología en España (1939–1997). Madrid: Alianza Editorial.

  • Sanz-Menéndez, L., & Pfretzchner, J. (1992). Política científica y gestión de la investigación: El CSIC (1986–1990) en el sistema español de ciencia y tecnología. Arbor, 557(May), 9–51.

  • Scimago, G. (2006). Producción ISI y tramos de investigación: Cómo combinarlos en un nuevo indicador. El profesional de la información, 15(3), 227–228.

  • Scimago, G. (2007). Producción ISI y tramos de investigación: Cómo combinarlos en un nuevo indicador (II). El profesional de la información, 16(4), 510–511.

  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, New York: Houghton Mifflin Co.

  • Shelton, R. D. (2008). Relations between national research investment and publication output: Application to an American paradox. Scientometrics, 74(2), 191–205.

  • SRI International (1988). Research activity in Spain, Portugal and Greece: A 1988 bibliometric model assessment (report prepared by Ailes, Coward and Fresne). Science and Technology Policy Programme, SRI International, Arlington (VA), mimeo.

  • Talib, A. A. (2001). The continuing behavioural modification of academics since the 1992 research assessment exercise. Higher Education Review, 33(3), 30–46.

  • Taylor, J. (2001). The impact of performance indicators on the work of university academics: Evidence from Australian universities. Higher Education Quarterly, 55(1), 42–61.

  • Van Raan, A. F. J. (1997). Science as an international enterprise. Science and Public Policy, 24(5), 290–300.

  • Westerheijden, D. (1997). A solid base for decisions: Use of the VSNU research evaluation in Dutch universities. Higher Education, 33(4), 397–413.

  • Whitley, R. (2003). Competition and pluralism in public sciences: The impact of institutional frameworks on the organization of academic science. Research Policy, 32(6), 1015–1039.

  • Whitley, R. (2007). The changing governance of the public sciences: The consequences of research evaluation systems for knowledge production in different countries and scientific fields. In R. Whitley & J. Gläser (Eds.), The changing governance of the sciences: The advent of research evaluation systems (pp. 3–27). Dordrecht (NL): Springer.

  • Whitley, R., & Gläser, J. (Eds.). (2007). The changing governance of the sciences: The advent of research evaluation systems. Dordrecht (NL): Springer.

  • Zitt, M., Perrot, F., & Barré, R. (1998). The transition from “national” to “transnational” model and related measures of countries’ performance. Journal of the American Society for Information Science, 49(1), 30–42.


Acknowledgments

This research has been funded by the Ministry of Science and Innovation (CSO-2008-03100/SOCI) and the AECID (D-026992/09). We thank Zaida Chinchilla and Félix de Moya for providing the annual publication data. A previous version of this paper was presented at the International Sociological Association (ISA) Conference (Research Committee 23) in Barcelona (September 2008); we acknowledge the comments, criticisms and suggestions of the participants, especially those of Jochen Gläser and Maria Nedeva. Additional suggestions and criticisms from Félix de Moya, Jochen Gläser, Catalina Martínez and two anonymous reviewers, which helped to improve the final version, are also acknowledged.

Author information


Correspondence to Luis Sanz-Menéndez.


About this article

Cite this article

Osuna, C., Cruz-Castro, L. & Sanz-Menéndez, L. Overturning some assumptions about the effects of evaluation systems on publication performance. Scientometrics 86, 575–592 (2011). https://doi.org/10.1007/s11192-010-0312-7
