DOI: 10.1145/2467696.2467715

Aggregating productivity indices for ranking researchers across multiple areas

Published: 22 July 2013

Abstract

The impact of scientific research has traditionally been quantified using productivity indices such as the well-known h-index. However, different research fields, and indeed different research areas within a single field, may have very different publishing patterns, which may not be well described by a single, global index. In this paper, we argue that productivity indices should account for the singularities of the publication patterns of different research areas in order to produce an unbiased assessment of the impact of scientific research. Inspired by ranking aggregation approaches in distributed information retrieval, we propose a novel approach for ranking researchers across multiple research areas. Our approach is generic and produces cross-area versions of any global productivity index, such as the volume of publications, citation count, and even the h-index. Our thorough evaluation, covering multiple areas within the broad field of Computer Science, shows that our cross-area indices outperform their global counterparts when assessed against the official ranking produced by CNPq, the Brazilian National Council for Scientific and Technological Development. As a result, this paper contributes a valuable mechanism to support the decisions of funding bodies and research agencies in research assessment efforts.
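To make the aggregation idea concrete, the minimal Python sketch below illustrates one way a cross-area index can be assembled: compute a global index (here the h-index) for each researcher, rank researchers inside each area, and merge the per-area rankings by within-area percentile, in the spirit of result merging in distributed information retrieval. The percentile normalization, the function and variable names, and the toy data are assumptions chosen for illustration; this is not the paper's exact aggregation method.

from collections import defaultdict

def h_index(citations):
    # h-index: the largest h such that the researcher has h papers
    # with at least h citations each.
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def cross_area_ranking(citations_by_researcher, area_of):
    # Illustrative cross-area aggregation: score each researcher by the
    # percentile of their h-index within their own area, then merge all
    # areas into a single ranking on that area-relative score.
    index = {r: h_index(c) for r, c in citations_by_researcher.items()}

    by_area = defaultdict(list)
    for r, h in index.items():
        by_area[area_of[r]].append((r, h))

    merged = []
    for area, members in by_area.items():
        members.sort(key=lambda rh: rh[1])          # ascending by h-index
        n = len(members)
        for pos, (r, h) in enumerate(members, start=1):
            merged.append((r, area, h, pos / n))    # within-area percentile

    merged.sort(key=lambda t: t[3], reverse=True)   # cross-area ranking
    return merged

# Toy data (hypothetical): two areas with very different citation patterns.
citations = {"alice": [50, 40, 30, 5], "bob": [8, 7, 6, 5, 4], "carol": [3, 3, 2]}
areas = {"alice": "databases", "bob": "HCI", "carol": "HCI"}
for r, area, h, pct in cross_area_ranking(citations, areas):
    print(f"{r:6s} {area:10s} h={h} area-percentile={pct:.2f}")

Because the merge step only needs a per-area ranking, the same skeleton can be reused for any global index mentioned in the abstract (publication volume, citation count, and so on) by swapping out h_index.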




    Published In

    JCDL '13: Proceedings of the 13th ACM/IEEE-CS joint conference on Digital libraries
    July 2013
    480 pages
    ISBN:9781450320771
    DOI:10.1145/2467696


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 22 July 2013


    Author Tags

    1. bibliometric indicators
    2. cross-disciplinarity
    3. ranking aggregation
    4. research performance

    Qualifiers

    • Research-article

    Conference

    JCDL '13
    Sponsor:
    JCDL '13: 13th ACM/IEEE-CS Joint Conference on Digital Libraries
    July 22 - 26, 2013
    Indianapolis, Indiana, USA

    Acceptance Rates

    JCDL '13 paper acceptance rate: 28 of 95 submissions (29%)
    Overall acceptance rate: 415 of 1,482 submissions (28%)


    Cited By

    • (2025) Investigating scholarly indices and their contribution to recognition patterns among awarded and non-awarded researchers. International Journal of Data Science and Analytics. DOI: 10.1007/s41060-024-00702-x. Online publication date: 15-Jan-2025.
    • (2023) Machine Learning Approach for Effective Ranking of Researcher Assessment Parameters. IEEE Access, 11, 133294-133312. DOI: 10.1109/ACCESS.2023.3336950. Online publication date: 2023.
    • (2023) Evaluating the Effectiveness of Author-Count Based Metrics in Measuring Scientific Contributions. IEEE Access, 11, 101710-101726. DOI: 10.1109/ACCESS.2023.3309416. Online publication date: 2023.
    • (2022) ScholarRec: a scholars’ recommender system that combines scholastic influence and social collaborations in academic social networks. International Journal of Data Science and Analytics, 16(2), 203-216. DOI: 10.1007/s41060-022-00345-w. Online publication date: 14-Jul-2022.
    • (2022) An artificial intelligence-based framework for data-driven categorization of computer scientists: a case study of world’s Top 10 computing departments. Scientometrics, 128(3), 1513-1545. DOI: 10.1007/s11192-022-04627-9. Online publication date: 31-Dec-2022.
    • (2020) Quantifying Success in Science: An Overview. IEEE Access, 8, 123200-123214. DOI: 10.1109/ACCESS.2020.3007709. Online publication date: 2020.
    • (2020) On knowledge-transfer characterization in dynamic attributed networks. Social Network Analysis and Mining, 10(1). DOI: 10.1007/s13278-020-00657-4. Online publication date: 13-Jun-2020.
    • (2020) On interdisciplinary collaborations in scientific coauthorship networks: the case of the Brazilian community. Scientometrics. DOI: 10.1007/s11192-020-03605-3. Online publication date: 13-Jul-2020.
    • (2019) Characterizing knowledge-transfer relationships in dynamic attributed networks. Proceedings of the 2019 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, 234-241. DOI: 10.1145/3341161.3342883. Online publication date: 27-Aug-2019.
    • (2019) A Novel Pareto-VIKOR Index for Ranking Scientists’ Publication Impacts: A Case Study on Evolutionary Computation Researchers. 2019 IEEE Congress on Evolutionary Computation (CEC), 2458-2465. DOI: 10.1109/CEC.2019.8790104. Online publication date: Jun-2019.
