
Sampling Precision to Depth 10000 at CLEF 2007

  • Conference paper
Advances in Multilingual and Multimodal Information Retrieval (CLEF 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5152)

Included in the following conference series: Cross-Language Evaluation Forum (CLEF)

Abstract

We conducted an experiment to test the completeness of the relevance judgments for the monolingual Bulgarian, Czech and Hungarian information retrieval tasks of the Ad-Hoc Track of the Cross-Language Evaluation Forum (CLEF) 2007. In the ad hoc retrieval tasks, the system was given 50 natural language queries, and the goal was to find all of the relevant documents (with high precision) in a particular document set. For each language, we submitted a sample of the first 10000 retrieved items to investigate the frequency of relevant items at deeper ranks than the official judging depth (of 60 for Czech and 80 for Bulgarian and Hungarian). The results suggest that, on average, the percentage of relevant items assessed was less than 60% for Czech, 70% for Bulgarian and 85% for Hungarian. These levels of completeness are in line with the estimates that have been made for some past test collections which are still considered useful and fair for comparing retrieval methods.
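The completeness figures above come from scaling up the relevance rate observed in a judged sample of deeply ranked items. The following Python sketch illustrates that general idea only; it is not the paper's code, and the stratum boundaries, sample sizes and relevance counts are invented for illustration.

def estimate_completeness(relevant_in_pool, strata):
    """Estimate what fraction of all relevant items the official pool covers.

    relevant_in_pool: relevant items already judged within the official depth.
    strata: (stratum_size, items_sampled, sampled_relevant) tuples covering
            the ranks below the official judging depth, down to depth 10000.
    """
    # Scale each stratum's observed relevance rate up to the whole stratum.
    estimated_deep_relevant = sum(
        size * relevant / sampled if sampled else 0.0
        for size, sampled, relevant in strata
    )
    estimated_total = relevant_in_pool + estimated_deep_relevant
    return relevant_in_pool / estimated_total if estimated_total else 1.0

# One hypothetical topic with an official judging depth of 80:
# 12 relevant items were found in the pool, and two deeper strata were sampled.
strata = [
    (920, 92, 1),     # ranks 81-1000: 92 sampled, 1 relevant -> ~10 estimated
    (9000, 180, 0),   # ranks 1001-10000: 180 sampled, none relevant
]
print(f"Estimated completeness: {estimate_completeness(12, strata):.0%}")
# -> roughly 55%: the official pool is estimated to hold about half of the
#    relevant items for this hypothetical topic.

The paper's actual sampling design (how many items were drawn from which depth ranges for each language) is described in the full text; the numbers above are only meant to show how per-topic completeness estimates in the 60-85% range can be derived from such a sample.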



Author information

Stephen Tomlinson

Editor information

Carol Peters, Valentin Jijkoun, Thomas Mandl, Henning Müller, Douglas W. Oard, Anselmo Peñas, Vivien Petras, Diana Santos


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tomlinson, S. (2008). Sampling Precision to Depth 10000 at CLEF 2007. In: Peters, C., et al. Advances in Multilingual and Multimodal Information Retrieval. CLEF 2007. Lecture Notes in Computer Science, vol 5152. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-85760-0_7

Download citation

  • DOI: https://doi.org/10.1007/978-3-540-85760-0_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-85759-4

  • Online ISBN: 978-3-540-85760-0

  • eBook Packages: Computer Science, Computer Science (R0)
