Abstract
As research topics, approaches, and applications in information retrieval (IR) become increasingly diversified, scientific collaboration between academia and industry is growing. However, the productivity, authorship, impact, and topic characteristics of publications authored within academia, within industry, and through academia-industry collaboration remain understudied. In this paper, we examine the features of and differences among these three types of studies in terms of productivity, authorship, and impact, and pay special attention to the research problems and topics that attract and foster academia-industry collaboration over the past two decades of IR research. To this end, we analyzed 36,072 research papers published by 52,419 authors in the field of IR from 2000 to 2021, collected from the ACM Digital Library. We find that the three categories show clear preferences in the academic conferences they target for publication. Regarding author teams, the industry community favors small teams or solo-authored publications compared with the academic community. As for impact, papers produced through academia-industry collaboration tend to receive more citations than research involving only one community. A thematic analysis of academia-industry collaborative papers and a co-authorship network analysis reveal the preferred research topics and the continuing "centrality" of researchers from academia. The findings offer a new perspective for analyzing advances and emerging trends in IR research and further clarify the cross-community collaborations and scientific contributions of academia and industry.
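The abstract mentions a co-authorship network analysis used to assess the "centrality" of academic versus industry researchers. The paper does not specify here how the network is built or which centrality measure is used, so the following is only a minimal sketch, assuming an undirected co-authorship graph, degree centrality computed with networkx, and hypothetical author/affiliation records.

```python
# Minimal sketch of a co-authorship centrality comparison.
# Assumptions (not from the paper): undirected co-authorship graph,
# degree centrality, and illustrative author/sector data.
from itertools import combinations
import networkx as nx

# Hypothetical records: one entry per paper, listing (author, sector) pairs.
papers = [
    [("A. Smith", "academia"), ("B. Chen", "industry")],
    [("A. Smith", "academia"), ("C. Ito", "academia"), ("B. Chen", "industry")],
    [("D. Kumar", "industry"), ("B. Chen", "industry")],
]

G = nx.Graph()
for author_list in papers:
    # Connect every pair of co-authors on the same paper.
    for (a, sector_a), (b, sector_b) in combinations(author_list, 2):
        G.add_node(a, sector=sector_a)
        G.add_node(b, sector=sector_b)
        G.add_edge(a, b)

# Degree centrality per author, then averaged by sector to compare
# the relative positions of academic and industry researchers.
centrality = nx.degree_centrality(G)
by_sector = {}
for author, score in centrality.items():
    by_sector.setdefault(G.nodes[author]["sector"], []).append(score)

for sector, scores in by_sector.items():
    print(sector, sum(scores) / len(scores))
```

Other centrality measures (e.g., betweenness or eigenvector centrality) could be substituted in the same pipeline; which one the authors used is not stated in this excerpt.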
Notes
The specific topics and subareas of collaboration are extracted and inferred from individual research papers and main authors’ research profiles available via Google Scholar and DBLP.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Lei, J., Bu, Y., Liu, J. (2023). Information Retrieval Research in Academia and Industry: A Preliminary Analysis of Productivity, Authorship, Impact, and Topic Distribution. In: Sserwanga, I., et al. Information for a Better World: Normality, Virtuality, Physicality, Inclusivity. iConference 2023. Lecture Notes in Computer Science, vol 13972. Springer, Cham. https://doi.org/10.1007/978-3-031-28032-0_29
DOI: https://doi.org/10.1007/978-3-031-28032-0_29
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-28031-3
Online ISBN: 978-3-031-28032-0
eBook Packages: Computer Science, Computer Science (R0)