ABSTRACT
In recent years, assessing the performance of researchers has become a burden due to the sheer volume of existing research output. As a result, evaluators often end up relying heavily on a small selection of performance indicators, like the h-index. However, over-reliance on such indicators risks reinforcing dubious research practices, while overlooking important aspects of a researcher's career, such as their exact role in the production of particular research works or their contribution to other important types of academic or research activity (e.g., the production of datasets, peer reviewing). In response, a number of initiatives have been established that attempt to provide guidelines towards fairer research assessment frameworks. In this work, we present BIP! Scholar, a Web-based service that offers researchers the opportunity to set up profiles that summarise their research careers, taking into consideration well-established guidelines for fair research assessment and facilitating the work of evaluators who want to be more compliant with these practices.
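For concreteness, the h-index criticised above is defined as the largest number h such that a researcher has h papers with at least h citations each. The abstract does not describe any implementation; the following is only a minimal illustrative sketch of that definition, with a hypothetical `h_index` function name.

```python
def h_index(citations):
    """Return the largest h such that h of the given papers
    have at least h citations each (Hirsch, 2005)."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still supports a larger h
            h = rank
        else:
            break  # remaining papers are cited even less
    return h
```

For example, citation counts `[10, 8, 5, 4, 3]` yield h = 4: four papers have at least four citations, but the fifth has only three. The example also hints at why a single scalar is lossy — it ignores authorship roles and non-publication contributions, which is precisely the gap the service described here aims to address.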
BIP! SCHOLAR: A Service to Facilitate Fair Researcher Assessment