DOI: 10.1145/2661829.2661900

Revisiting the Divergence Minimization Feedback Model

Published: 03 November 2014

Abstract

Pseudo-relevance feedback (PRF) has proven to be an effective strategy for improving retrieval accuracy. In this paper, we revisit a PRF method based on statistical language models, namely the divergence minimization model (DMM). DMM not only has an apparently sound theoretical foundation, but has also been shown to satisfy most of the retrieval constraints. However, it turns out to perform surprisingly poorly in many previous experiments. We investigate the cause and reveal that DMM inappropriately tackles the entropy of the feedback model, which produces a highly skewed feedback model. To address this problem, we propose a maximum-entropy divergence minimization model (MEDMM), which introduces an entropy term to regularize DMM. Our experiments on various TREC collections demonstrate that MEDMM not only works much better than DMM but also outperforms several other state-of-the-art PRF methods, especially on web collections. Moreover, unlike existing PRF models, which must be combined with the original query to perform well, MEDMM works effectively even without being combined with the original query.
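The idea summarized above can be sketched in code. This is a minimal illustration, not the authors' implementation: the function name, the Dirichlet smoothing parameter `mu`, and the weights `lam` (collection-model penalty) and `beta` (entropy regularization) are all assumptions for the sketch. The key point it demonstrates is that adding an entropy term to the divergence-minimization objective turns the solution into a softmax with temperature `beta`, so a larger `beta` yields a flatter, less skewed feedback model.

```python
import math
from collections import Counter

def medmm_feedback_model(feedback_docs, collection, lam=0.1, beta=1.2, mu=10.0):
    """Sketch of a maximum-entropy divergence-minimization feedback model.

    feedback_docs: list of token lists (the pseudo-relevant documents)
    collection:    token list approximating the background collection
    lam:           weight of the collection-model penalty (IDF-like effect)
    beta:          entropy-regularization weight; larger beta -> flatter model
    mu:            Dirichlet smoothing parameter for document language models
    """
    coll_counts = Counter(collection)
    coll_len = sum(coll_counts.values())
    p_coll = {w: c / coll_len for w, c in coll_counts.items()}

    doc_counts = [Counter(d) for d in feedback_docs]
    doc_lens = [len(d) for d in feedback_docs]
    vocab = set(w for dc in doc_counts for w in dc)
    k = len(feedback_docs)

    scores = {}
    for w in vocab:
        # Average log-likelihood of w under Dirichlet-smoothed document
        # models, minus a penalty for words common in the whole collection.
        avg_log = sum(
            math.log((dc[w] + mu * p_coll.get(w, 1e-9)) / (n + mu))
            for dc, n in zip(doc_counts, doc_lens)
        ) / k
        scores[w] = (avg_log - lam * math.log(p_coll.get(w, 1e-9))) / beta

    # The entropy term makes the closed-form solution a softmax with
    # temperature beta, avoiding the highly skewed models of plain DMM.
    m = max(scores.values())
    expd = {w: math.exp(s - m) for w, s in scores.items()}
    z = sum(expd.values())
    return {w: v / z for w, v in expd.items()}
```

As `beta` approaches zero the softmax concentrates almost all probability mass on the single best-scoring term (the skew problem diagnosed in the paper), while raising `beta` spreads mass over more expansion terms.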




Published In

CIKM '14: Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management
November 2014
2152 pages
ISBN:9781450325981
DOI:10.1145/2661829
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. additive smoothing
  2. divergence minimization
  3. maximum entropy
  4. query language model

Qualifiers

  • Poster

Conference

CIKM '14

Acceptance Rates

CIKM '14 paper acceptance rate: 175 of 838 submissions, 21%
Overall acceptance rate: 1,861 of 8,427 submissions, 22%


Article Metrics

  • Downloads (last 12 months): 11
  • Downloads (last 6 weeks): 0
Reflects downloads up to 28 Feb 2025


Cited By

  • (2024) A Survey of Model Compression and Its Feedback Mechanism in Federated Learning. Proceedings of the 5th ACM Workshop on Intelligent Cross-Data Analysis and Retrieval, 37-42. DOI: 10.1145/3643488.3660293
  • (2023) Query Context Expansion for Open-Domain Question Answering. ACM Transactions on Asian and Low-Resource Language Information Processing 22(8), 1-21. DOI: 10.1145/3603498
  • (2023) Entity-Based Relevance Feedback for Document Retrieval. Proceedings of the 2023 ACM SIGIR International Conference on Theory of Information Retrieval, 177-187. DOI: 10.1145/3578337.3605128
  • (2023) Pseudo Relevance Feedback with Deep Language Models and Dense Retrievers: Successes and Pitfalls. ACM Transactions on Information Systems 41(3), 1-40. DOI: 10.1145/3570724
  • (2022) LoL. Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, 825-836. DOI: 10.1145/3477495.3532017
  • (2022) Improving Query Representations for Dense Retrieval with Pseudo Relevance Feedback: A Reproducibility Study. Advances in Information Retrieval, 599-612. DOI: 10.1007/978-3-030-99736-6_40
  • (2021) QA4PRF: A Question Answering Based Framework for Pseudo Relevance Feedback. IEEE Access 9, 139303-139314. DOI: 10.1109/ACCESS.2021.3118600
  • (2021) Pseudo relevance feedback optimization. Information Retrieval 24(4-5), 269-297. DOI: 10.1007/s10791-021-09393-5
  • (2021) PGT: Pseudo Relevance Feedback Using a Graph-Based Transformer. Advances in Information Retrieval, 440-447. DOI: 10.1007/978-3-030-72240-1_46
  • (2020) A Reinforcement Learning Framework for Relevance Feedback. Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, 59-68. DOI: 10.1145/3397271.3401099
