The research output of European higher education institutions

Abstract

The measurement of the research output of Higher Education Institutions (HEIs) is problematic, owing to the multi-product nature of their teaching and research activities. This study analyses the difficulties involved in measuring the research output of HEIs and proposes a simple overall indicator that incorporates quantitative and qualitative aspects and permits the decomposition of the influence of the two factors. On the basis of this indicator, homogeneous comparisons are made of the relative research output of the countries of the European Union and of its evolution during the period 1996–2010.
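One way to see such a decomposition (a minimal sketch under an assumed multiplicative form, for illustration only; the paper's exact indicator may differ): if P denotes the number of publications of a country and q̄ their average quality (e.g. citations per publication relative to the world average), an overall indicator of the form

    R = P × q̄,    so that    Δln R = Δln P + Δln q̄,

separates differences and changes in research output into a quantity component and a quality component.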

Notes

  1. Many studies have demonstrated the positive effects of the research activities of universities upon regional economic development, especially in the case of North American universities (e.g. Goldstein and Renault 2004; O’Shea et al. 2005; Bramwell and Wolfe 2005). Pastor and Peraita (2012) offer a review of studies of the socioeconomic contribution of universities.

  2. This positive relationship also holds in terms of the number of citations.

  3. Some studies (FCYD 2008; Salas 2012) propose the additional use of diverse indicators of the quality of university teaching, such as the drop-out rate, the performance rate, the student–teacher ratio, expenditure per student, the number of information technology (IT) and library staff per student, the number of doctorates with an honourable mention, etc. At the aggregate level, there are also proposals to capture differences and/or improvements in the quality of teaching activity through the use of salaries, under the assumption that, ceteris paribus, higher graduate salaries reflect a higher quality of the education received. On this question, see Mortensen et al. (2011).

  4. The research frontier may have shifted, the researcher’s attention might have moved on to other problems, it may be intellectually or psychologically challenging to start work on a delayed paper, etc. (Klitkou and Gulbrandsen 2010).

  5. Klitkou and Gulbrandsen (2010) state that in interviews, some academic inventors claim they cannot talk about their most recent research because the relevant patents have not yet been secured.

  6. Moreover, some researchers have indicated in interviews that patents are sometimes based on the first draft of a scientific paper and that the patent application is written by a specialised professional (Klitkou and Gulbrandsen 2010).

  7. A more detailed discussion of the complementarity or substitutability of publishing and patenting and their determinants is to be found in Salas (2012) and Crespi et al. (2011).

  8. The impact factor (IF) is an indicator that reflects the average number of citations of recent articles published in scientific journals (Garfield 2006). It is frequently used as a proxy for the relative importance of a journal within its field, with journals having a higher IF deemed more important than those with lower ones. IFs are calculated yearly for the journals indexed in the Thomson Reuters Journal Citation Reports. In a given year, the impact factor of a journal is the average number of citations received per paper published in that journal during the two preceding years. The SCImago Journal Rank (SJR) is a measure of the influence of scientific journals; it accounts both for the number of citations received by a journal and for the importance or prestige of the journals from which those citations proceed. Falagas et al. (2008) offer a comparison between the IF and the SJR. See González-Pereira et al. (2010) for the advantages of the SJR indicator, as an alternative metric of journals’ scientific prestige, and for a comparison with the IF. The Eigenfactor score (Bergstrom 2007) is an indicator of the importance of a scientific journal: journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted so as to make a larger contribution to the Eigenfactor than those from poorly ranked journals. The h-index (Hirsch 2005) is an indicator which attempts to measure the impact of a researcher’s published work. It is based on the set of the researcher's most frequently cited papers and the number of citations they have received in other publications, and is intended to measure simultaneously the quality and quantity of scientific output. Franceschini and Maisano (2011) propose a structured method to compare academic research groups within the same discipline by means of several Hirsch-based (h-based) bibliometric indicators. Sidiropoulos et al. (2007) have developed the “Generalized H-index” to resolve some of the problems of the h-index. An alternative, proposed by Vieira and Gomes (2010), is the nh3 index, which is designed to measure solely the impact of research, independently of the size of the institution. Bornmann et al. (2010) also propose certain refinements to the h-index. (A minimal computational illustration of the IF and the h-index is given in the sketch following these notes.)

  9. A positive correlation between peer judgements and different citation-based indicators has been found (Rinia et al. 1998). Charlton and Andras (2007) suggest using the total citations of universities as a measure of output. According to these authors, this indicator has certain advantages over other indicators: it is cheap, quick, simple, transparent, objective and replicable, and it permits international and longitudinal comparisons.

  10. The information is available on the following website: http://www.scimagojr.com/countryrank.php.

  11. The Scopus database contains a larger number of journals and covers the humanities. It doubles the number of journals indexed compared with the WoS, ensuring greater thematic and geographical coverage (Corera et al. 2010).

  12. For example, Moed et al. (2011) analyse relationships between university research performance and concentration using the SCImago database. They find that a larger publication output is associated with a higher citation impact.

  13. SCImago is a Spanish research group constituted by the High Council for Scientific Research (Consejo Superior de Investigaciones Científicas [CSIC]) and the Universities of Granada, Extremadura, Carlos III (Madrid) and Alcalá de Henares; it is dedicated to information analysis, representation and retrieval by means of visualisation techniques.
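As a computational illustration of the IF and h-index definitions discussed in note 8, the following minimal Python sketch (the function names and sample figures are hypothetical assumptions, not taken from the paper) shows how the two indicators are computed:

    # Two-year impact factor (Garfield 2006): citations received in year t
    # to items published in years t-1 and t-2, divided by the number of
    # citable items published in those two years.
    def impact_factor(citations_to_prev_two_years, items_prev_two_years):
        return citations_to_prev_two_years / items_prev_two_years

    # h-index (Hirsch 2005): the largest h such that an author has h
    # papers with at least h citations each.
    def h_index(citation_counts):
        h = 0
        for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical example: 120 citations in 2014 to 60 papers published
    # in 2012-2013 gives IF = 2.0; the citation list below gives h = 3.
    print(impact_factor(120, 60))      # 2.0
    print(h_index([10, 5, 3, 2, 1]))   # 3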

References

  • Acosta, M., Coronado, D., Ferrándiz, E., & Dolores León, M. (2014). Regional scientific production and specialization in Europe: The role of HERD. European Planning Studies, 22(5), 949–974.

  • Agrawal, A., & Henderson, R. (2002). Putting patents in context: Exploring knowledge transfer from MIT. Management Science, 48(1), 44–60.

  • Azoulay, P., Ding, W., & Stuart, T. (2009). The impact of academic patenting on the rate, quality and direction of (public) research output. The Journal of Industrial Economics, 57(4), 637–676.

  • Bergstrom, C. T. (2007). Eigenfactor: Measuring the value and prestige of scholarly journals. College & Research Libraries News, 68(5), 314–316.

  • Bornmann, L., Mutz, R., & Daniel, H.-D. (2010). The h index research output measurement: Two approaches to enhance its accuracy. Journal of Informetrics, 4, 407–414.

  • Bramwell, A., & Wolfe, D. A. (2005). Universities and regional economic development: The Entrepreneurial University of Waterloo. Canadian political science association annual conference, June 2–4, Ontario.

  • Braun, T., & Glänzel, W. (1990). United Germany: The new scientific superpower? Scientometrics, 19(5), 513–521.

  • Breschi, S., Lissoni, F., & Montobbio, F. (2007). The scientific productivity of academic inventors: New evidence from Italian data. Economics of Innovation and New Technology, 16(2), 101–118.

  • Breschi, S., Lissoni, F., & Montobbio, F. (2008). University patenting and scientific productivity: A quantitative study of Italian academic inventors. European Management Review, 5(2), 91–109.

  • Carayol, N. (2007). Academic incentives, research organization and patenting at a large French university. Economics of Innovation and New Technology, 16(2), 119–138.

  • Center for Science and Technology Studies (CWTS) (2009). The Leiden ranking. Retrieved in November from http://www.cwts.nl/ranking/LeidenRankingWebSite.html.

  • Charlton, B. G., & Andras, P. (2007). Evaluating universities using simple scientometric research-output metrics: Total citation counts per university for a retrospective seven-year rolling sample. Science and Public Policy, 34(8), 555–563.

  • Corera, E., Chinchilla, Z., De-Moya, F., & Sanz-Menéndez, L. (2010). Producción científica e impacto: ranking general y por áreas de las instituciones universitarias españolas. Informe CyD 2009 (pp. 254–262). Barcelona: Fundación CyD.

  • Crespi, G., D’Este, P., Fontana, R., & Geuna, A. (2011). The impact of academic patenting on university research and its transfer. Research Policy, 40, 55–68.

  • Czarnitzki, D., Glänzel, W., & Hussinger, K. (2007). Patent and publication activities of German professors: An empirical assessment of their co-activity. Research Evaluation, 16(4), 311–319.

  • Czarnitzki, D., Glänzel, W., & Hussinger, K. (2009). Heterogeneity of patenting activity and its implications for scientific research. Research Policy, 38, 26–34.

  • Fabrizio, K. R., & DiMinin, A. (2005). Commercializing the laboratory: Faculty patenting and the open science environment. Research Policy, 37, 914–931.

  • Falagas, M. E., Kouranos, V. D., Arencibia-Jorge, R., & Karageorgopoulos, D. E. (2008). Comparison of SCImago journal rank indicator with journal impact factor. The FASEB Journal, 22, 2623–2628.

  • Franceschini, F., & Maisano, D. (2011). Structured evaluation of the scientific output of academic research groups by recent h-based indicators. Journal of Informetrics, 5, 64–74.

  • Garfield, E. (2006). The history and meaning of the journal impact factor. JAMA, 295(1), 90–93.

  • Glänzel, W., Thijs, B., Schubert, A., & Debackere, K. (2009). Subfield-specific normalized relative indicators and a new generation of relational charts: Methodological foundations illustrated on the assessment of institutional research performance. Scientometrics, 78(1), 165–188.

  • Goldstein, H. A., & Renault, C. S. (2004). Contributions of universities to regional economic development: A quasi-experimental approach. Regional Studies, 38, 733–746.

  • González-Albo, B., Moreno, L., Morillo, F., & Bordons, M. (2012). Bibliometric indicators for the analysis of the research performance of a multidisciplinary institution: The CSIC. Revista Española de Documentación Científica, 35(1), 9–37.

  • González-Pereira, B., Guerrero-Bote, V. P., & Moya-Anegón, F. (2010). A new approach to the metric of journals’ scientific prestige: The SJR indicator. Journal of Informetrics, 4, 379–391.

  • Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. PNAS, 102(46), 16569–16572.

  • IEDCYT (Instituto de Estudios Documentales sobre Ciencia y Tecnología), CCHS (Centro de Ciencias Humanas y Sociales) and CSIC (Consejo Superior de Investigaciones Científicas) (2009). La actividad científica del CSIC a través del Web of Science. Estudio bibliométrico del período 2000–2007. Madrid. Available at: http://www.cindoc.csic.es.

  • King, D. (2004). The scientific impact of nations. What different countries get for their research spending. Nature, 432(4), 311–316.

  • Klitkou, A., & Gulbrandsen, M. (2010). The relationship between academic patenting and scientific publishing in Norway. Scientometrics, 82(1), 93–108.

  • Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.

  • Moed, H. F., Moya-Anegón, F., López-Illescas, C., & Visser, M. (2011). Is concentration of university research associated with better research performance? Journal of Informetrics, 5, 649–658.

  • Mortensen, J., O’Mahony, M., Pastor, J. M., Serrano, L., & Stokes, L. (2011). Measuring education input, output and outcomes: State of the art and data availability. INDICSER review paper No. 4, European Commission INDICSER project indicators for evaluating international performance in service sectors.

  • O’Shea, R. P., Allen, T. J., Chevalier, A., & Roche, F. (2005). Entrepreneurial orientation, technology transfer and spinoff performance of U.S. Universities. Research Policy, 34, 994–1009.

  • Pastor, J. M., & Peraita, C. (2012). La contribución socioeconómica del sistema universitario español. Ministerio de Educación (forthcoming).

  • Rey, O. (2009). Quality indicators and educational research publications: Which publications count? Dossier d’actualité No. 46, June–July. Service de Veille Scientifique et Technologique, L’Institut Français de l’Éducation. Available at: http://ife.ens-lyon.fr/vst/DA/detailsDossier.php?parent=accueil&dossier=46&lang=en.

  • Rinia, E. J., van Leeuwen, T. N., van Vuren, H. G., & van Raan, A. F. J. (1998). Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands. Research Policy, 27(1), 95–107.

  • Salas, V. (2012). La producción universitaria. El caso chileno. Departamento de Economía. Universidad de Santiago de Chile.

  • SCImago (2012). SIR world report 2012. Global ranking. SCIMAGO Institutions Rankings (SIR). Available at: http://www.scimagoir.com.

  • SCImago (2012). SJR: SCImago Journal & Country Rank. Retrieved July 2012 from http://www.scimagojr.com.

  • Sidiropoulos, A., Katsaros, D., & Manolopoulos, Y. (2007). Generalized Hirsch h-index for disclosing latent facts in citation networks. Scientometrics, 72(2), 253–280.

  • Stephan, P., Gurmu, S., Sumell, A. J., & Black, G. (2007). Who’s patenting in the university? Economics of Innovation and New Technology, 16(2), 71–99.

  • Thijs, B., & Glänzel, W. (2009). A structural analysis of benchmarks on different bibliometrical indicators for European research institutes based on their research profile. Scientometrics, 79(2), 377–388.

  • Van Looy, B., Callaert, J., & Debackere, K. (2006). Publication and patent behaviour of academic researchers: Conflicting, reinforcing or merely co-existing? Research Policy, 35(4), 596–609.

  • Vieira, E. S., & Gomes, J. A. N. F. (2010). A research impact indicator for institutions. Journal of Informetrics, 4, 581–590.

  • Vieira, E. S., Nouws, H. P. A., Albergaria, J. T., Matos, C. D., & Gomes, J. A. N. F. (2009). Research quality indicators for Brazilian, Portuguese and Spanish Universities. In 12th international conference on scientometrics and informetrics, 14–17 July, Rio de Janeiro, Brazil.

  • Vinkler, P. (1986). Evaluation of some methods for the relative assessment of scientific publications. Scientometrics, 10(3), 157–177.

Acknowledgments

This paper was developed as part of the INDICSER project, funded by the European Commission, Research Directorate General, under the 7th Framework Programme, Theme 8: Socio-Economic Sciences and Humanities (Grant Agreement No: 244 709). José Manuel Pastor and Lorenzo Serrano wish to thank the Spanish Ministry of Science and Innovation for its financial support (ECO2011-23248).

Author information

Corresponding author

Correspondence to José Manuel Pastor.

About this article

Cite this article

Pastor, J.M., Serrano, L. & Zaera, I. The research output of European higher education institutions. Scientometrics 102, 1867–1893 (2015). https://doi.org/10.1007/s11192-014-1509-y

