Abstract
For the 2004 Cross-Language Evaluation Forum (CLEF) interactive track (iCLEF), five participating teams used a common evaluation design to assess how well interactive systems of their own design support users in finding specific answers to narrowly focused questions in a collection of documents written in a language different from that of the questions. The task is an interactive counterpart to the fully automatic cross-language question answering task run at CLEF 2003 and 2004. This paper describes the iCLEF 2004 evaluation design, outlines the experiments conducted by the participating teams, and presents initial results from analyses of the official evaluation measures reported to each team.
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Gonzalo, J., Oard, D.W. (2005). iCLEF 2004 Track Overview: Pilot Experiments in Interactive Cross-Language Question Answering. In: Peters, C., Clough, P., Gonzalo, J., Jones, G.J.F., Kluck, M., Magnini, B. (eds) Multilingual Information Access for Text, Speech and Images. CLEF 2004. Lecture Notes in Computer Science, vol 3491. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11519645_32
Print ISBN: 978-3-540-27420-9
Online ISBN: 978-3-540-32051-7