Abstract
This paper reviews the current status of academic search engines and emerging trends in scientific information retrieval, and argues for two key claims. First, systematic searches rely on the widespread use of academic search engines, which are generally not powered by cutting-edge Artificial Intelligence (AI) and are not well positioned to further the goals of findability and discoverability; the tradition of systematic search therefore carries non-trivial epistemic costs. Second, while narrative reviews are typically criticized for their lack of transparency, accountability, and reproducibility, they deserve a place in scientific research. Specifically, once narrative reviews are properly understood as enabled by modern tools such as non-academic search engines, AI-powered recommender systems, and academic social networks, it becomes possible to appreciate how they can indeed further the goal of literature discoverability. The upshot of this piece is that multiple goals and trade-offs are involved in the process of scientific document search, and that we should acknowledge the virtues and limitations of different approaches to information retrieval and be prepared to welcome their combined use.
Polonioli, A. In search of better science: on the epistemic costs of systematic reviews and the need for a pluralistic stance to literature search. Scientometrics 122, 1267–1274 (2020). https://doi.org/10.1007/s11192-019-03333-3