
Off-line Evaluation of Recommendation Functions

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 3538))

Abstract

This paper proposes a novel method for assessing the performance of any Web recommendation function (i.e., user model), M, used in a Web recommender system, based on an off-line computation over labeled session data. Each labeled session consists of a sequence of Web pages followed by a page p^(IC) that contains information the user claims is relevant. We then apply M to the same session to produce a corresponding suggested page p^(S). In general, we say that M is good if p^(S) has content "similar" to the associated p^(IC). This paper defines a number of functions for estimating this p^(S)-to-p^(IC) similarity that can be used to evaluate any new models off-line, and provides empirical data to demonstrate that evaluations based on these similarity functions match our intuitions.
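The abstract's protocol — run the model M on each labeled session, then score how similar the suggested page p^(S) is to the user-labeled page p^(IC) — can be sketched as follows. The paper's actual similarity functions are not given in this excerpt, so this sketch assumes a simple bag-of-words cosine similarity as one plausible instance; `offline_score` and its inputs are hypothetical names for illustration.

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between bag-of-words term-frequency vectors.

    A stand-in for the paper's p^(S)-to-p^(IC) similarity functions,
    which are not specified in this excerpt.
    """
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def offline_score(model, labeled_sessions) -> float:
    """Mean similarity between the model's suggestion and p^(IC).

    model: callable mapping a session's page sequence to suggested
           page text (the p^(S) produced by user model M).
    labeled_sessions: iterable of (session_pages, p_ic_text) pairs.
    """
    scores = []
    for session_pages, p_ic_text in labeled_sessions:
        p_s_text = model(session_pages)  # M suggests a page for this session
        scores.append(cosine_similarity(p_s_text, p_ic_text))
    return sum(scores) / len(scores)
```

Because the score depends only on logged sessions and page text, any candidate user model can be compared on the same labeled data without a live user study — the point of the off-line evaluation the paper proposes.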





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhu, T., Greiner, R., Häubl, G., Jewell, K., Price, B. (2005). Off-line Evaluation of Recommendation Functions. In: Ardissono, L., Brna, P., Mitrovic, A. (eds) User Modeling 2005. UM 2005. Lecture Notes in Computer Science(), vol 3538. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11527886_44


  • DOI: https://doi.org/10.1007/11527886_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-27885-6

  • Online ISBN: 978-3-540-31878-1

  • eBook Packages: Computer Science (R0)
