Abstract
Journals increasingly use online supplemental information (OSI) to convey material previously included in the papers themselves. Material displaced to OSI is often accompanied by references that, with rare exceptions, are not incorporated into citation databases. An analysis of OSI in a random sample of papers published in 2013 in the Proceedings of the National Academy of Sciences of the USA revealed that unique references listed only in OSI amount to more than 10% of the references included in the papers themselves. The obliteration of these references in citation databases introduces substantial inaccuracies into citation counts, with a bias against papers that are cited only in methods sections, which are typically the sections displaced to OSI.
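The size of this "black hole" can be framed as a simple ratio. The sketch below illustrates the calculation with invented counts (the function name and the example numbers are hypothetical, not the paper's data): if a database indexes only the main-text reference list, every unique OSI-only reference is a citation lost to the cited work.

```python
# Hypothetical illustration (invented numbers, not the paper's data):
# estimate the share of references that appear only in online supplemental
# information (OSI) and are thus invisible to a citation database that
# indexes only the main-text reference list.

def osi_only_share(main_refs: int, osi_only_refs: int) -> float:
    """OSI-only references as a fraction of main-text references."""
    return osi_only_refs / main_refs

# e.g. a paper listing 40 references in its main text, with 5 unique
# references appearing only in its OSI:
share = osi_only_share(40, 5)
print(f"{share:.1%}")  # 12.5%
```

On this reading, a figure above 10% means that for every ten citations a database counts, more than one additional citation goes uncounted because it lives only in supplemental material.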
Acknowledgments
This research was supported by Grant PSI2012-32903 (Ministerio de Economía y Competitividad, Spain).
Cite this article
García-Pérez, M.A. Online supplemental information: a sizeable black hole for citations. Scientometrics 102, 1655–1659 (2015). https://doi.org/10.1007/s11192-014-1348-x