Abstract
The transition to quantitative assessment of scientific performance in Russian science in the early 2010s led to a sharp increase in publication activity. This growth has been accompanied by inflation: publication counts have risen faster than actual research output, because not always justified collaborations allow the same results to be credited to several organisations, while the average quality of the publication flow has declined. Growth has been achieved mainly through publications in conference proceedings and low-ranked journals. Since 2020, the Ministry of Science and Higher Education of the Russian Federation has applied a new assessment methodology that implements a model of collaboration penalties and quartile bonuses: publications are counted fractionally across affiliated organisations, and articles in high-impact journals receive multiplying factors. The analysis carried out in this study shows that the primary beneficiaries of this transformation are academic institutions specialising in Chemistry, Materials Science, and Life Sciences. Organisations traditionally strong in Physics and Astronomy lag somewhat behind the leaders, mainly because of their active participation in mega-collaborations. A separate scale of quality factors gives unjustified advantages to social science organisations, which may increase the number of breaches of scientific ethics. Overall, the new system should help mitigate distortions in the development of Russian science; however, some side effects of its implementation need close attention and will probably require adjustments to the methodology.
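The incentive mechanism named above (fractional counting combined with quartile multipliers) can be summarised by a single scoring formula; this is a minimal sketch, with notation and weights assumed here for illustration rather than taken from the ministry's published methodology:

$$S_{\text{org}} = \sum_{p \in P_{\text{org}}} \frac{w_{Q(p)}}{k_p},$$

where $P_{\text{org}}$ is the set of publications affiliated with an organisation, $k_p$ is the number of organisations sharing credit for publication $p$ (the collaboration penalty of fractional counting), and $w_{Q(p)}$ is a multiplier that grows with the quartile of the publishing journal (the quartile bonus). Under illustrative weights such as $w_{Q1} = 3$ and $w_{Q4} = 1$, a sole-affiliation Q1 article contributes 3 points, whereas a Q1 article produced by a mega-collaboration and shared among 100 organisations contributes only $3/100 = 0.03$ points, which is the mechanism by which participation in large physics collaborations depresses an institution's score under the new scheme.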
Funding
This study was supported by the Ministry of Science and Higher Education of the Russian Federation, project nos. 0334-2019-006 and FZFM-2022-0001.
Ethics declarations
Conflict of interest
The authors have no conflicts of interest to declare that are relevant to the content of this article.
Ethical approval
The research did not involve human participants.
About this article
Cite this article
Kosyakov, D., Guskov, A. Reasons and consequences of changes in Russian research assessment policies. Scientometrics 127, 4609–4630 (2022). https://doi.org/10.1007/s11192-022-04469-5