Abstract
The h-index, or Hirsch index, named after Jorge E. Hirsch, is one of the few author-based metrics currently available that offers a perspective on the productivity and citation impact of a scientist, researcher, or academic. Four tools are most commonly used to calculate the h-index, each depending on a separate database: Scopus, Web of Science, Google Scholar, and ResearchGate. Using the h-indexes of the authors of this paper derived from these four sources, it is abundantly clear that the scores vary widely, and it remains unclear which of these sources is a reliable or accurate source of information for any purpose. As the use and application of author-based metrics increases, including for official academic purposes, it is becoming increasingly important to know which source of the h-index is most accurate, and thus valid. Although this is not a review of the h-index, some perspectives on the h-index-related literature are provided to place this case study within the wider context of the weaknesses of, and criticisms directed at, the h-index as a metric for evaluating scientific output.
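For readers unfamiliar with the metric, Hirsch (2005) defines an author's h-index as the largest number h such that the author has at least h papers cited at least h times each. A minimal sketch of that definition (the function name and example citation counts are illustrative, not drawn from this paper's data):

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have at least h citations each (Hirsch, 2005)."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # paper at this rank still meets the threshold
            h = rank
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times.
# Four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because the computation depends entirely on the list of papers and citation counts supplied, databases with different coverage (Scopus, Web of Science, Google Scholar, ResearchGate) will, as this paper shows, yield different h-indexes for the same author.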
Notes
https://www.researchgate.net/profile/Jaime_Teixeira_Da_Silva (this represents the most accurate and up-to-date database of publications for the first author, curated by the first author).
https://www.researchgate.net/profile/Judit_Dobranszki (this represents the most accurate and up-to-date database of publications for the second author, curated by the second author).
http://clarivate.libguides.com/webofscienceplatform/coverage (coverage: 138 million records (journals, books, and proceedings)).
https://www.elsevier.com/__data/assets/pdf_file/0007/69451/0597-Scopus-Content-Coverage-Guide-US-LETTER-v4-HI-singles-no-ticks.pdf (coverage: 69 million core records + 147 million non-core records or documents).
https://scholar.google.com/citations?user=StDfMFQAAAAJ&hl=en (approximately 35 papers previously listed, with one duplicate, now showing 15 after correction; compare to 20 papers on RG account: https://www.researchgate.net/profile/Leonid_Schneider). He was accused of fraud on Twitter, claiming initially that he would not clean up his GS profile: https://twitter.com/schneiderleonid/status/906080994758930432 (“I've better things to do than daily check stuff Google Scholar adds to my profile. Now that I was accused of fraud, I sure won't clean it up”); https://twitter.com/schneiderleonid/status/906064401274470400 (“I won't be deleting false papers, since I never use Google Scholar as my official publications list reference. Let's see how this continues.”).
Abbreviations
- ABM: Author-based metric
- GS: Google Scholar
- h-index: Hirsch index
- SRA: Scientist, researcher, or academic
- WoS: Web of Science
References
Adriaanse, L. S., & Rensleigh, C. (2013). Web of Science, Scopus and Google Scholar: A content comprehensiveness comparison. The Electronic Library, 31(6), 727–744. https://doi.org/10.1108/EL-12-2011-0174.
Bar-Ilan, J. (2008). Which h-index?—A comparison of WoS, Scopus and Google Scholar. Scientometrics, 74(2), 257–271. https://doi.org/10.1007/s11192-008-0216-y.
Bornmann, L., & Daniel, H.-D. (2009). The state of h index research. Is the h index the ideal way to measure research performance? EMBO Reports, 10(1), 2–6. https://doi.org/10.1038/embor.2008.233.
Costas, R., & Bordons, M. (2007). The h-index: Advantages, limitations and its relation with other bibliometric indicators at the micro level. Journal of Informetrics, 1, 193–203. https://doi.org/10.1016/j.joi.2007.02.001.
Flatt, J. W., Blasimme, A., & Vayena, E. (2017). Improving the measurement of scientific success by reporting a self-citation index. Publications, 5, 20. https://doi.org/10.3390/publications5030020.
Glänzel, W., & Persson, O. (2005). H-index for Price medallists. ISSI Newsletter, 1(4), 15–18.
Halevi, G., Moed, H., & Bar-Ilan, J. (2017). Suitability of Google Scholar as a source of scientific information and as a source of data for scientific evaluation—Review of the literature. Journal of Informetrics, 11(3), 823–834. https://doi.org/10.1016/j.joi.2017.06.005.
Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences USA, 102(46), 16569–16572. https://doi.org/10.1073/pnas.0507655102.
Lippi, G., & Mattiuzzi, C. (2017). Scientist impact factor (SIF): A new metric for improving scientists’ evaluation? Annals of Translational Medicine, 5(15), 303. https://doi.org/10.21037/atm.2017.06.24.
Popova, O., Romanov, D., Drozdov, A., & Gerashchenko, A. (2017). Citation-based criteria of the significance of the research activity of scientific teams. Scientometrics, 112(3), 1179–1202. https://doi.org/10.1007/s11192-017-2427-6.
Saraykar, S., Saleh, A., & Selek, S. (2017). The association between NIMH funding and h-index in psychiatry. Academic Psychiatry, 41, 455–459. https://doi.org/10.1007/s40596-016-0654-4.
Svider, P. F., Husain, Q., Folbe, A. J., Couldwell, W. T., Liu, J. K., & Eloy, J. A. (2014). Assessing national institutes of health funding and scholarly impact in neurological surgery. Journal of Neurosurgery, 120(1), 191–196. https://doi.org/10.3171/2013.8.JNS13938.
Teixeira da Silva, J. A. (2013). The global science factor v. 1.1: A new system for measuring and quantifying quality in science. The Asian and Australasian Journal of Plant Science and Biotechnology, 7(Special Issue 1), 92–101.
Teixeira da Silva, J. A. (2016). Science watchdogs. Academic Journal of Interdisciplinary Studies, 5(3), 13–15. https://doi.org/10.5901/ajis.2016.v5n3p13.
Teixeira da Silva, J. A. (2017). The journal impact factor (JIF): Science publishing’s miscalculating metric. Academic Questions, 30(4), 433–441.
Teixeira da Silva, J. A., & Bernès, S. (2017). Clarivate Analytics: Continued omnia vanitas impact factor culture. Science and Engineering Ethics. https://doi.org/10.1007/s11948-017-9873-7. (in press).
Teixeira da Silva, J. A., & Bornemann-Cimenti, H. (2017). Why do some retracted papers continue to be cited? Scientometrics, 110(1), 365–370. https://doi.org/10.1007/s11192-016-2178-9.
Teixeira da Silva, J. A., & Dobránszki, J. (2017). Highly cited retracted papers. Scientometrics, 110(3), 1653–1661. https://doi.org/10.1007/s11192-016-2227-4.
Ethics declarations
Conflict of interest
The authors declared that they have no conflict of interest.
Cite this article
Teixeira da Silva, J.A., Dobránszki, J. Multiple versions of the h-index: cautionary use for formal academic purposes. Scientometrics 115, 1107–1113 (2018). https://doi.org/10.1007/s11192-018-2680-3