Short paper · DOI: 10.1145/2808194.2809493

Pooling for User-Oriented Evaluation Measures

Published: 27 September 2015

ABSTRACT

Traditional TREC-style pooling methodology relies on systems' predicted relevance to select documents for judgment, which matches typical search behaviour (e.g., web search). In temporally ordered streams of documents, however, users encounter documents in temporal order, not in some predetermined rank order. We investigate a user-oriented pooling methodology that focuses on the documents simulated users would likely read in such temporally ordered streams. Under this user model, many of the relevant documents found by the TREC 2013 Temporal Summarization Track's pooling effort would never be read. Not only does our pooling strategy focus on documents that will be read by (simulated) users, but the resulting pools also differ from the standard TREC pools.
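To make the contrast concrete, the sketch below sets a traditional depth-k pool beside a user-oriented pool under a deliberately simple user model. Everything in it is illustrative: the function names, the (timestamp, document) stream representation, and the fixed reading-budget model are assumptions made here for exposition, not the user simulation actually used in the paper.

```python
from typing import Dict, List, Set, Tuple

def trec_depth_k_pool(runs: Dict[str, List[str]], k: int) -> Set[str]:
    """Traditional TREC-style pooling: union of the top-k documents
    from each system's ranked run."""
    pool: Set[str] = set()
    for ranked_docs in runs.values():
        pool.update(ranked_docs[:k])
    return pool

def user_oriented_pool(streams: Dict[str, List[Tuple[float, str]]],
                       reading_budget: int) -> Set[str]:
    """Hypothetical user-oriented pooling: each run emits a stream of
    (timestamp, doc_id) pairs; a simulated user reads each stream in
    temporal order until a fixed per-run reading budget is exhausted,
    and only the documents actually read enter the pool."""
    pool: Set[str] = set()
    for stream in streams.values():
        # Users encounter documents in temporal order, not rank order,
        # so sort by timestamp before applying the reading budget.
        for _, doc_id in sorted(stream)[:reading_budget]:
            pool.add(doc_id)
    return pool
```

Under such a model, a relevant document that arrives after the budget is exhausted can appear in the depth-k pool (if some system ranks it highly) but never in the user-oriented pool, which is the kind of divergence the abstract describes.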

Published in
ICTIR '15: Proceedings of the 2015 International Conference on The Theory of Information Retrieval
September 2015 · 402 pages
ISBN: 9781450338332
DOI: 10.1145/2808194
Copyright © 2015 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates
ICTIR '15 paper acceptance rate: 29 of 57 submissions (51%). Overall acceptance rate: 209 of 482 submissions (43%).