DOI: 10.1145/1148170.1148284

Bias and the limits of pooling

Published: 06 August 2006

Abstract

Modern retrieval test collections are built through a process called pooling, in which only a sample of the entire document set is judged for each topic. The idea behind pooling is to find enough relevant documents that, when unjudged documents are assumed to be nonrelevant, the resulting judgment set is sufficiently complete and unbiased. As document sets grow larger, a constant-size pool represents an increasingly small percentage of the document set, and at some point the assumption of approximately complete judgments must become invalid. This paper demonstrates that the AQUAINT 2005 test collection exhibits bias caused by pools that were too shallow for the document set size, despite many diverse runs contributing to the pools. The existing judgment set favors relevant documents that contain topic title words, even though relevant documents containing few topic title words are known to exist in the document set. The paper concludes with suggested modifications to traditional pooling and evaluation methodology that may allow very large reusable test collections to be built.
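To make the pooling process the abstract describes concrete, here is a minimal Python sketch of depth-k pooling, the unjudged-means-nonrelevant assumption the paper tests, and a rough title-word-skew check. All names and data structures (build_pool, is_relevant, title_word_overlap, the run and judgment dictionaries) are illustrative assumptions, not the paper's actual implementation.

```python
from collections import defaultdict


def build_pool(runs, depth=100):
    """Depth-k pooling: union the top-`depth` documents of every
    contributing run, per topic. Only pooled documents get judged.

    runs: {run_name: {topic_id: [doc_ids, ranked best-first]}}
    returns: {topic_id: set of doc_ids to judge}
    """
    pool = defaultdict(set)
    for ranking_by_topic in runs.values():
        for topic, ranked_docs in ranking_by_topic.items():
            pool[topic].update(ranked_docs[:depth])
    return pool


def is_relevant(topic, doc_id, judgments):
    """The assumption under test: a document absent from the judgment
    set (i.e., never pooled) is treated as nonrelevant (0)."""
    return judgments.get((topic, doc_id), 0) > 0


def title_word_overlap(relevant_docs, topic_title, doc_text):
    """Fraction of judged-relevant documents containing at least one
    topic-title word; values near 1.0 across topics are the kind of
    title-word skew the paper reports for AQUAINT 2005."""
    title_words = set(topic_title.lower().split())
    hits = sum(1 for d in relevant_docs
               if title_words & set(doc_text[d].lower().split()))
    return hits / len(relevant_docs) if relevant_docs else 0.0
```

Because the pool depth stays fixed while the document set grows, the pooled fraction of the collection shrinks, which is the mechanism by which the nonrelevance assumption eventually fails.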




Published In

SIGIR '06: Proceedings of the 29th annual international ACM SIGIR conference on Research and development in information retrieval
August 2006
768 pages
ISBN: 1595933697
DOI: 10.1145/1148170
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. pooling
  2. retrieval test collections

Qualifiers

  • Article

Conference

SIGIR '06: The 29th Annual International SIGIR Conference
August 6-11, 2006
Seattle, Washington, USA

Acceptance Rates

Overall Acceptance Rate 792 of 3,983 submissions, 20%



Cited By

  • (2024) A probabilistic model for API contract specification retrieval focusing on the openAPI standard. Data Mining and Knowledge Discovery, 39:1. DOI: 10.1007/s10618-024-01073-4. Online publication date: 13-Nov-2024.
  • (2024) Generative Information Retrieval Evaluation. Information Access in the Era of Generative AI, pages 135-159. DOI: 10.1007/978-3-031-73147-1_6. Online publication date: 12-Sep-2024.
  • (2022) Senatus. Proceedings of the 19th International Conference on Mining Software Repositories, pages 511-523. DOI: 10.1145/3524842.3527947. Online publication date: 23-May-2022.
  • (2022) Hard Negatives or False Negatives. Proceedings of the 31st ACM International Conference on Information & Knowledge Management, pages 118-127. DOI: 10.1145/3511808.3557343. Online publication date: 17-Oct-2022.
  • (2017) Measuring Effectiveness in the TREC Legal Track. Current Challenges in Patent Information Retrieval, pages 163-182. DOI: 10.1007/978-3-662-53817-3_6. Online publication date: 26-Mar-2017.
  • (2017) An Introduction to Contemporary Search Technology. Current Challenges in Patent Information Retrieval, pages 47-73. DOI: 10.1007/978-3-662-53817-3_2. Online publication date: 26-Mar-2017.
  • (2016) Estimating the Reliability of the Retrieval Systems Rankings. 2016 International Conference on Software Networking (ICSN), pages 1-5. DOI: 10.1109/ICSN.2016.7501924. Online publication date: May-2016.
  • (2015) Pooling for User-Oriented Evaluation Measures. Proceedings of the 2015 International Conference on The Theory of Information Retrieval, pages 341-344. DOI: 10.1145/2808194.2809493. Online publication date: 27-Sep-2015.
  • (2013) Choices in batch information retrieval evaluation. Proceedings of the 18th Australasian Document Computing Symposium, pages 74-81. DOI: 10.1145/2537734.2537745. Online publication date: 5-Dec-2013.
  • (2013) An analysis of crowd workers mistakes for specific and complex relevance assessment task. Proceedings of the 22nd ACM international conference on Information & Knowledge Management, pages 1873-1876. DOI: 10.1145/2505515.2507884. Online publication date: 27-Oct-2013.
