Abstract
In this talk I summarize the components of a traditional laboratory-style evaluation experiment in information retrieval (as exemplified by TREC), and discuss some of the issues around this form of experiment. Some kinds of research questions fit very well into this framework; others much less easily. The major area of difficulty for the framework concerns the user interface and user information-seeking behaviour. I go on to discuss a series of experiments conducted at City University with the Okapi system, both of the traditional form and of a more user-oriented type. I then discuss the current TREC filtering track, which does not present quite such severe problems, but is nevertheless based on a simple model of how users might interact with the system; this has some effect on the experimental methodology.
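The laboratory-style experiment the abstract refers to reduces, at its core, to scoring a system's ranked output against a set of relevance judgements. A minimal sketch of the standard measures (precision at a cutoff, recall, and non-interpolated average precision); the document identifiers are illustrative, not drawn from any TREC collection:

```python
# Sketch of the core measurements in a laboratory-style (Cranfield/TREC)
# retrieval experiment: a system returns a ranked list of document IDs
# for a query, and assessors supply a set of judged-relevant IDs.

def precision_at(ranked, relevant, k):
    """Fraction of the top-k retrieved documents that are relevant."""
    return sum(1 for d in ranked[:k] if d in relevant) / k

def recall_at(ranked, relevant, k):
    """Fraction of all relevant documents found in the top k."""
    return sum(1 for d in ranked[:k] if d in relevant) / len(relevant)

def average_precision(ranked, relevant):
    """Mean of the precision values at the ranks where relevant
    documents occur (unretrieved relevant docs contribute zero)."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant)

ranked = ["d3", "d7", "d1", "d9", "d4"]   # system output, best first
relevant = {"d3", "d1", "d5"}             # assessor judgements

print(precision_at(ranked, relevant, 3))    # 2 of top 3 relevant
print(recall_at(ranked, relevant, 3))       # 2 of 3 relevant found
print(average_precision(ranked, relevant))  # (1/1 + 2/3) / 3
```

Averaging the last quantity over a topic set gives mean average precision, one of the measures whose stability Buckley and Voorhees examine in the reference list below.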
References
Sparck Jones, K. (ed.): Information retrieval experiment. Butterworths, London (1981). Also available at http://www.itl.nist.gov/iad/894.02/projects/irlib/pubs/ire/iretoc.html
Robertson, S.E.: The methodology of information retrieval experiment. In: Sparck Jones, K. (ed.) [1], 2–31
Robertson, S.E. and Hancock-Beaulieu, M.: On the evaluation of IR systems. Information Processing & Management 28:457–466 (1992)
A special issue devoted to work with the Okapi system at City University: Journal of Documentation 53:1 (1997). Overview article: Robertson, S.E.: Overview of the Okapi projects, 3–7. User interface evaluation: Beaulieu, M.: Experiments on interfaces to support query expansion, 8–19
Hull, D. and Robertson, S.E.: The TREC-8 Filtering Track Final Report. Available from http://trec.nist.gov/pubs/trec8/t8_proceedings.html
Buckley, C. and Voorhees, E.: Evaluating evaluation measure stability. In: Belkin, N.J., Ingwersen, P. and Leong, M.-K. (eds): SIGIR 2000. ACM Press, New York (2000) 33–40
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this chapter
Robertson, S. (2000). Evaluation in Information Retrieval. In: Agosti, M., Crestani, F., Pasi, G. (eds) Lectures on Information Retrieval. ESSIR 2000. Lecture Notes in Computer Science, vol 1980. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45368-7_4
Print ISBN: 978-3-540-41933-4
Online ISBN: 978-3-540-45368-0