Abstract
The problem of finding documents that are written in a language that the searcher cannot read is perhaps the most challenging application of Cross-Language Information Retrieval (CLIR) technology. The first Cross-Language Evaluation Forum (CLEF) provided an excellent venue for assessing the performance of automated CLIR techniques, but little is known about how searchers and systems might interact to achieve better cross-language search results than automated systems alone can provide. This paper explores the question of how interactive approaches to CLIR might be evaluated, suggesting an initial focus on evaluation of interactive document selection. Important evaluation issues are identified, the structure of an interactive CLEF evaluation is proposed, and the key research communities that could be brought together by such an evaluation are introduced.
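The evaluation focus the abstract proposes, scoring a searcher's document selections against relevance judgments, is a set-based comparison. Below is a minimal sketch of such scoring using van Rijsbergen's F measure, a weighted combination of precision and recall; the alpha weighting, the document identifiers, and the example judgments are illustrative assumptions, not values taken from the paper.

# Hedged sketch: set-based scoring of interactive document selection.
# A searcher's selected documents are compared against assessor relevance
# judgments using van Rijsbergen's F measure; alpha trades off precision
# against recall (alpha = 0.5 is the balanced F measure).

def f_measure(selected: set, relevant: set, alpha: float = 0.5) -> float:
    """Score a searcher's document selections against relevance judgments."""
    if not selected or not relevant:
        return 0.0
    true_positives = len(selected & relevant)
    if true_positives == 0:
        return 0.0
    precision = true_positives / len(selected)
    recall = true_positives / len(relevant)
    return 1.0 / (alpha / precision + (1.0 - alpha) / recall)

# Illustrative judgments for one topic: the searcher marked four documents
# as relevant after examining translations; assessors judged five relevant.
selected = {"doc3", "doc7", "doc9", "doc12"}
relevant = {"doc3", "doc7", "doc15", "doc21", "doc26"}

print(f"F (balanced)     = {f_measure(selected, relevant):.3f}")
print(f"F (recall-heavy) = {f_measure(selected, relevant, alpha=0.2):.3f}")

Lowering alpha rewards selection strategies that favor completeness over accuracy, one of the design choices an interactive evaluation would need to fix in advance so that results are comparable across sites.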
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Oard, D.W. (2001). Evaluating Interactive Cross-Language Information Retrieval: Document Selection. In: Peters, C. (eds) Cross-Language Information Retrieval and Evaluation. CLEF 2000. Lecture Notes in Computer Science, vol 2069. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44645-1_6
DOI: https://doi.org/10.1007/3-540-44645-1_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42446-8
Online ISBN: 978-3-540-44645-3