Abstract
This year marked UAIC’s first participation in the INFILE@CLEF competition. The purpose of this campaign is to evaluate cross-language filtering systems, i.e., automated systems that separate relevant from non-relevant documents written in different languages with respect to a given profile. For the batch filtering task, participants are provided with the whole document collection and must return the list of relevant documents for each topic. We achieved good results in filtering documents and also obtained the highest originality score with English as the target language. Our team was also the only one to submit runs for cross-lingual and multilingual batch filtering, with French and English/French as target languages. This paper gives a brief description of our system, including its Parsing, Indexing and Filtering modules, together with the results of the submitted runs.
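For illustration, the sketch below shows one simple way a batch filtering step of this kind could be organised in Python: every document in the collection is given a TF-IDF weight vector, each topic profile is turned into a query vector, and documents whose cosine similarity to the profile exceeds a threshold are returned as relevant. All names, the weighting scheme and the threshold value are illustrative assumptions; this is not the UAIC system described in the paper, which is built around dedicated Parsing, Indexing and Filtering modules.

# Minimal, hypothetical sketch of batch filtering: score every document
# against a topic profile with TF-IDF cosine similarity and keep those
# above a relevance threshold. Illustrative only, not the UAIC system.
import math
from collections import Counter

def tokenize(text):
    return [t for t in text.lower().split() if t.isalpha()]

def tf_idf_vectors(docs):
    # docs: dict doc_id -> text; returns dict doc_id -> {term: weight}
    tokenized = {d: tokenize(t) for d, t in docs.items()}
    df = Counter(term for toks in tokenized.values() for term in set(toks))
    n = len(docs)
    vectors = {}
    for d, toks in tokenized.items():
        tf = Counter(toks)
        vectors[d] = {term: (1 + math.log(c)) * math.log(n / df[term])
                      for term, c in tf.items()}
    return vectors

def cosine(u, v):
    dot = sum(w * v.get(term, 0.0) for term, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def batch_filter(docs, topics, threshold=0.2):
    # topics: dict topic_id -> profile text (e.g. title plus description keywords)
    doc_vecs = tf_idf_vectors(docs)
    results = {}
    for topic_id, profile in topics.items():
        # Topic vector uses plain log-tf weights (a simplification).
        profile_tf = Counter(tokenize(profile))
        topic_vec = {term: 1 + math.log(c) for term, c in profile_tf.items()}
        results[topic_id] = [d for d, vec in doc_vecs.items()
                             if cosine(topic_vec, vec) >= threshold]
    return results

In a real setting the threshold would be tuned per topic, and an inverted index (for instance Lucene) would replace the naive full scan over the collection.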
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Drăguşanu, CA., Grigoriu, A., Iftene, A. (2010). UAIC: Participation in INFILE@CLEF Task. In: Peters, C., et al. Multilingual Information Access Evaluation I. Text Retrieval Experiments. CLEF 2009. Lecture Notes in Computer Science, vol 6241. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15754-7_43
DOI: https://doi.org/10.1007/978-3-642-15754-7_43
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-15753-0
Online ISBN: 978-3-642-15754-7