Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?

Abstract

In the last 3 years, several new (free) sources for academic publication and citation data have joined the now well-established Google Scholar, complementing the two traditional commercial data sources: Scopus and the Web of Science. The most important of these new data sources are Microsoft Academic (2016), Crossref (2017) and Dimensions (2018). Whereas Microsoft Academic has received some attention from the bibliometric community, there are as yet very few studies that have investigated the coverage of Crossref or Dimensions. To address this gap, this brief letter assesses Crossref and Dimensions coverage in comparison to Google Scholar, Microsoft Academic, Scopus and the Web of Science through a detailed investigation of the full publication and citation record of a single academic, as well as six top journals in Business & Economics. Overall, this first small-scale study suggests that, when compared to Scopus and the Web of Science, Crossref and Dimensions have a similar or better coverage for both publications and citations, but a substantively lower coverage than Google Scholar and Microsoft Academic. If our findings can be confirmed by larger-scale studies, Crossref and Dimensions might serve as good alternatives to Scopus and the Web of Science for both literature reviews and citation analysis. However, Google Scholar and Microsoft Academic maintain their position as the most comprehensive free sources for publication and citation data.
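As a rough illustration of how such a coverage check could be run against one of the new free sources, the sketch below queries Crossref's public REST API for records matching an author name and sums Crossref's own citation counts (the is-referenced-by-count field). This sketch is not part of the original letter: the author name, contact address and row limit are illustrative assumptions, and name-based matching is approximate, so a real comparison like the one in this letter would still require manual checking of the publication record.

```python
# Minimal sketch: estimate an author's Crossref coverage by publication and
# citation counts via the public Crossref REST API (api.crossref.org).
# The author name, mailto address and row limit are illustrative assumptions.
import requests

def crossref_coverage(author_name, rows=100, mailto="you@example.org"):
    """Return (record_count, total_citations) for Crossref works whose
    author field matches `author_name`. Name matching is approximate."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={
            "query.author": author_name,  # fuzzy author-name query
            "rows": rows,                 # number of records to retrieve
            "select": "DOI,title,is-referenced-by-count",
            "mailto": mailto,             # identifies the caller ("polite pool")
        },
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    citations = sum(item.get("is-referenced-by-count", 0) for item in items)
    return len(items), citations

if __name__ == "__main__":
    pubs, cites = crossref_coverage("Anne-Wil Harzing")
    print(f"Crossref records: {pubs}, total Crossref citations: {cites}")
```

Note that Crossref's citation counts only reflect references that publishers have deposited with Crossref, which is consistent with its lower citation coverage relative to Google Scholar and Microsoft Academic reported above.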

Notes

  1. Our Publish or Perish Dimensions searches for the three names we tested (Eugene Garfield, Blaise Cronin and Mike Thelwall) produced citation levels that were (slightly) above those reported for Scopus in this article. The lack of coverage for author searches in Dimensions reported in this article might thus have been caused by a problem in the author disambiguation system that has since been resolved.

  2. Our focus on the Social Sciences might limit generalization as the two commercial databases are less comprehensive for the Social Sciences, Arts & Humanities and to a lesser extent Engineering than for the Sciences and Life Sciences (for a detailed analysis, see Harzing and Alakangas 2016, 2017a). Hence, differences in coverage between the six data sources might be smaller for the Sciences and Life Sciences.

References

  • Fairhurst, V. (2019). The first year of the Crossref Ambassador Program: Highlights and challenges. Science Editing, 6(1), 58–63.

  • Harzing, A. W. (2007). Publish or Perish. https://harzing.com/resources/publish-or-perish. Accessed 3 Apr 2019.

  • Harzing, A. W. (2016). Microsoft Academic (Search): A Phoenix arisen from the ashes? Scientometrics, 108(3), 1637–1647.

  • Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787–804.

  • Harzing, A. W., & Alakangas, S. (2017a). Microsoft Academic: Is the phoenix getting wings? Scientometrics, 110(1), 371–383.

  • Harzing, A. W., & Alakangas, S. (2017b). Microsoft Academic is one year old: The Phoenix is ready to leave the nest. Scientometrics, 112(3), 1887–1894.

  • Hug, S. E., & Brändle, M. P. (2017). The coverage of Microsoft Academic: Analyzing the publication output of a university. Scientometrics, 113(3), 1551–1571.

  • Hug, S. E., Ochsner, M., & Brändle, M. P. (2017). Citation analysis with Microsoft Academic. Scientometrics, 111(1), 371–378.

  • Martín-Martín, A., Orduña-Malea, E., Thelwall, M., & Delgado-López-Cózar, E. (2018). Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories. Journal of Informetrics, 12(4), 1160–1177.

  • Orduña-Malea, E., & Delgado-López-Cózar, E. (2018). Dimensions: Re-discovering the ecosystem of scientific information. El Profesional de la Información, 27(2), 420–431.

  • Thelwall, M. (2017). Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals. Journal of Informetrics, 11(4), 1201–1212.

  • Thelwall, M. (2018a). Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis. Journal of Informetrics, 12(1), 1–9.

  • Thelwall, M. (2018b). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430–435.

Author information

Corresponding author

Correspondence to Anne-Wil Harzing.

About this article

Cite this article

Harzing, A. W. Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science? Scientometrics 120, 341–349 (2019). https://doi.org/10.1007/s11192-019-03114-y
