
Training Students to Evaluate Search Engines

Teaching and Learning in Information Retrieval

Part of the book series: The Information Retrieval Series ((INRE,volume 31))

Abstract

This chapter describes two exercises that formed part of an information retrieval course. Both were designed to assess the qualities of a search engine and to consider how the engine could be improved, engaging students in problem-based learning. The exercises were run for several years and proved successful both in engaging students in the course and in establishing links with organisations interested in the evaluation of search.


Notes

  1. In more recent years, the quality of this search engine has been much improved.

  2. http://www.alexa.com/.

  3. http://www.nationalarchives.gov.uk/.


Acknowledgments

The author is grateful to TNA for allowing their search engine to be used in the evaluation exercise. He also wishes to thank the reviewers of this chapter for their invaluable comments. Finally, many thanks to the students who took part in the exercises and whose innovation, feedback, and comments made running the IR course such a joy.


Corresponding author

Correspondence to Mark Sanderson.


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Sanderson, M., Warner, A. (2011). Training Students to Evaluate Search Engines. In: Efthimiadis, E., Fernández-Luna, J., Huete, J., MacFarlane, A. (eds) Teaching and Learning in Information Retrieval. The Information Retrieval Series, vol 31. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22511-6_12


  • DOI: https://doi.org/10.1007/978-3-642-22511-6_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-22510-9

  • Online ISBN: 978-3-642-22511-6

  • eBook Packages: Computer Science (R0)
