
Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 3237))


Abstract

We describe the overall organization of the CLEF 2003 evaluation campaign, with a particular focus on the cross-language ad hoc and domain-specific retrieval tracks. The paper discusses the evaluation approach adopted, describes the tracks and tasks offered and the test collections used, and provides an outline of the guidelines given to the participants. It concludes with an overview of the techniques employed for results calculation and analysis for the monolingual, bilingual, multilingual, and GIRT tasks.




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Braschler, M., Peters, C. (2004). CLEF 2003 Methodology and Metrics. In: Peters, C., Gonzalo, J., Braschler, M., Kluck, M. (eds) Comparative Evaluation of Multilingual Information Access Systems. CLEF 2003. Lecture Notes in Computer Science, vol 3237. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30222-3_2


  • DOI: https://doi.org/10.1007/978-3-540-30222-3_2


  • Print ISBN: 978-3-540-24017-4

  • Online ISBN: 978-3-540-30222-3

