Abstract
Search engines struggle to search knowledge repositories effectively because they are not tailored to users' differing information needs. User queries are, more often than not, under-specified or contain ambiguous terms that retrieve irrelevant documents. Personalized query reformulation aims to refine queries per user, enhancing the relevance of search results while avoiding semantic drift. The task remains challenging because users' search sessions contain few queries, and each query itself is often ambiguous and brief. Existing methods have employed session history or click-throughs to enrich the query context, but one crucial cue has been overlooked: the user herself. In this paper, we propose leveraging conditional transformers such as the text-to-text transfer transformer (T5) to prepend a user-tailored pretext to the input sequence as a prior condition for generating a personalized reformulation of the query in the output sequence. Our experiments on the AOL query log demonstrate the effectiveness of T5 for personalized query reformulation, without loss of generality to other conditional transformers. The codebase supporting the reproducibility of our research is available at https://github.com/fani-lab/RePair/tree/uid-wise24.
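As a minimal illustration of the conditioning idea described above (not the authors' RePair implementation), the sketch below prepends a hypothetical user identifier as a pretext to the query before passing it to a Hugging Face T5 checkpoint; the prompt template, the "t5-small" checkpoint, and the decoding settings are assumptions for illustration only.

```python
# A minimal sketch of user-conditioned query reformulation with T5.
# NOT the paper's RePair code: the pretext format ("<user_id>: <query>"),
# the "t5-small" checkpoint, and the decoding settings are illustrative assumptions.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "t5-small"  # assumption: any T5-family checkpoint could be fine-tuned instead
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

def reformulate(user_id: str, query: str, max_new_tokens: int = 32) -> str:
    """Prepend a user-specific pretext to the query and let T5 generate a reformulation."""
    prompt = f"{user_id}: {query}"  # hypothetical pretext: user id as the prior condition
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example: an ambiguous query such as "jaguar price" could be reformulated
# differently per user once the model is fine-tuned on user-specific pairs.
print(reformulate("u_142", "jaguar price"))
```

In practice such a model would first be fine-tuned on per-user (input, target) pairs, e.g., the pretext-prefixed original query as input and a user-grounded reformulation as target; this pairing scheme is an assumption for the sketch, not a description of the paper's training data.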
Cite this paper
Narayanan, Y.L., Fani, H. (2025). RePair My Queries: Personalized Query Reformulation via Conditional Transformers. In: Barhamgi, M., Wang, H., Wang, X. (eds.) Web Information Systems Engineering – WISE 2024. Lecture Notes in Computer Science, vol. 15436. Springer, Singapore. https://doi.org/10.1007/978-981-96-0579-8_16