Web Search Relevance Feedback

Definition

Relevance feedback refers to an interactive cycle that improves retrieval performance based on relevance judgments provided by a user. Specifically, when a user issues a query to describe an information need, the information retrieval system first returns a set of initial results and asks the user to judge whether some of the retrieved items (typically documents or passages) are relevant. The system then reformulates the query based on the collected feedback and returns a new set of retrieval results, which is presumably better than the initial one. This procedure can be repeated.
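
The query reformulation step can be made concrete with Rocchio's classic vector-space method [8], which moves the query vector toward the judged-relevant documents and away from the judged-non-relevant ones. The following Python sketch is illustrative only: the sparse term-to-weight dictionaries stand in for tf-idf vectors, and the alpha, beta, and gamma values are conventional defaults rather than anything prescribed by this entry.

    from collections import defaultdict

    def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
        """One Rocchio feedback step over sparse term -> weight vectors."""
        new_query = defaultdict(float)
        for term, w in query.items():      # keep the original query terms
            new_query[term] += alpha * w
        for doc in relevant:               # move toward judged-relevant documents
            for term, w in doc.items():
                new_query[term] += beta * w / len(relevant)
        for doc in nonrelevant:            # move away from non-relevant documents
            for term, w in doc.items():
                new_query[term] -= gamma * w / len(nonrelevant)
        # keep only positively weighted terms for the next retrieval round
        return {t: w for t, w in new_query.items() if w > 0}

    # One feedback cycle for an ambiguous query, after the user marks an
    # animal-related result relevant and a car-related result non-relevant.
    query = {"jaguar": 1.0}
    relevant = [{"jaguar": 0.8, "cat": 0.6, "wildlife": 0.5}]
    nonrelevant = [{"jaguar": 0.7, "car": 0.9, "dealer": 0.4}]
    print(rocchio(query, relevant, nonrelevant))

Running the example adds "cat" and "wildlife" to the query and drops "car" and "dealer", so the reformulated query disambiguates the user's intent before the next retrieval round.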

Historical Background

The quality of retrieval results depends heavily on how effective a user's query (usually a set of keywords) is at distinguishing relevant documents from non-relevant ones. Ideally, the keywords used in the query would occur only in the relevant documents and in no non-relevant document. Unfortunately, in...
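
The intuition that good query terms should separate relevant from non-relevant documents is formalized in the classic relevance weighting scheme of Robertson and Sparck Jones [6], which scores a term by contrasting how often it appears in the judged-relevant documents with how often it appears in the rest of the collection. A minimal sketch, following the standard formulation with 0.5 smoothing (the example counts are invented for illustration):

    import math

    def rsj_weight(r, n, R, N):
        # r: judged-relevant documents containing the term
        # n: documents in the whole collection containing the term
        # R: number of documents judged relevant
        # N: total number of documents in the collection
        # The 0.5 terms are the usual smoothing that avoids zero counts.
        return math.log(((r + 0.5) * (N - n - R + r + 0.5))
                        / ((n - r + 0.5) * (R - r + 0.5)))

    # A term appearing in 8 of 10 judged-relevant documents but in only
    # 20 of 10,000 documents overall is a strong discriminator and gets a
    # large positive weight; an evenly spread term would score near zero.
    print(rsj_weight(r=8, n=20, R=10, N=10_000))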

Recommended Reading

  1. Allan J. Relevance feedback with too much data. In: Proceedings of the 18th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval; 1995. p. 337–43.

  2. Buckley C. Automatic query expansion using SMART: TREC-3. In: Harman D, editor. Proceedings of the Third Text REtrieval Conference (TREC-3); 1995. p. 69–80.

  3. Burges C, Shaked T, Renshaw E, Lazier A, Deeds M, Hamilton N, Hullender G. Learning to rank using gradient descent. In: Proceedings of the 22nd International Conference on Machine Learning; 2005. p. 89–96.

  4. Joachims T. Optimizing search engines using clickthrough data. In: Proceedings of the 8th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining; 2002. p. 133–42.

  5. Kelly D, Teevan J. Implicit feedback for inferring user preference. SIGIR Forum. 2003;37(2):18–28.

  6. Robertson SE, Jones KS. Relevance weighting of search terms. J Am Soc Inf Sci. 1976;27(3):129–46.

  7. Robertson SE, Walker S, Jones S, Hancock-Beaulieu MM, Gatford M. Okapi at TREC-3. In: Proceedings of the Third Text REtrieval Conference (TREC-3); 1995. p. 109–26.

  8. Rocchio JJ. Relevance feedback in information retrieval. In: The SMART retrieval system: experiments in automatic document processing. Englewood Cliffs: Prentice-Hall; 1971. p. 313–23.

  9. Ruthven I, Lalmas M. A survey on the use of relevance feedback for information access systems. Knowl Eng Rev. 2003;18(2):95–145.

  10. Salton G, Buckley C. Improving retrieval performance by relevance feedback. J Am Soc Inf Sci. 1990;41(4):288–97.

  11. Shen X, Zhai C. Active feedback in ad hoc information retrieval. In: Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval; 2005. p. 59–66.

  12. Singhal A, Mitra M, Buckley C. Learning routing queries in a query zone. In: Proceedings of the 20th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval; 1997. p. 25–32.

  13. Wang X, Fang H, Zhai C. A study of methods for negative relevance feedback. In: Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval; 2008. p. 219–26.

  14. Xu J, Croft WB. Query expansion using local and global document analysis. In: Proceedings of the 19th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval; 1996. p. 4–11.

  15. Zhai C, Lafferty J. Model-based feedback in the language modeling approach to information retrieval. In: Proceedings of the 10th International Conference on Information and Knowledge Management; 2001. p. 403–10.

Copyright information

© 2018 Springer Science+Business Media, LLC, part of Springer Nature

About this entry

Cite this entry

Fang, H., Zhai, C.X. (2018). Web Search Relevance Feedback. In: Liu, L., Özsu, M.T. (eds) Encyclopedia of Database Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8265-9_462
