DOI: 10.1145/1277741.1277957
Article

Effects of highly agreed documents in relevancy prediction

Published: 23 July 2007

Abstract

Finding significant contextual features is a challenging task in the development of interactive information retrieval (IR) systems. This paper investigates a simple method to facilitate this task by looking at aggregated relevance judgements of retrieved documents. Our study suggests that agreement on relevance judgements can indicate the effectiveness of retrieved documents as a source of significant features. The effect of highly agreed documents has practical implications for the design of adaptive search models in interactive IR systems.
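The abstract's core idea is that documents whose relevance judgements agree strongly make better sources of contextual features. A minimal sketch of that idea follows; it is not the paper's implementation, and the judgement format, agreement threshold, and term-frequency weighting are assumptions made purely for illustration.

```python
# Hypothetical sketch: select "highly agreed" relevant documents from
# multiple relevance judgements, then mine them for candidate feature terms.
from collections import Counter


def agreement(judgements):
    """Fraction of assessors giving the majority judgement (1.0 = full agreement)."""
    counts = Counter(judgements)
    return counts.most_common(1)[0][1] / len(judgements)


def highly_agreed_relevant(docs, threshold=0.8):
    """Keep documents judged relevant by the majority with agreement >= threshold."""
    selected = []
    for doc_id, (judgements, text) in docs.items():
        majority = Counter(judgements).most_common(1)[0][0]
        if majority == 1 and agreement(judgements) >= threshold:
            selected.append((doc_id, text))
    return selected


def significant_terms(selected_docs, top_k=10):
    """Naive term-frequency ranking over the highly agreed documents."""
    tf = Counter()
    for _, text in selected_docs:
        tf.update(text.lower().split())
    return [term for term, _ in tf.most_common(top_k)]


if __name__ == "__main__":
    # Toy data: doc_id -> (binary judgements from three assessors, document text).
    docs = {
        "d1": ([1, 1, 1], "context features improve interactive retrieval"),
        "d2": ([1, 0, 0], "unrelated document about something else"),
    }
    chosen = highly_agreed_relevant(docs)
    print(significant_terms(chosen))
```

Only d1 passes the agreement filter in this toy run, so the extracted terms come from the document the assessors agreed on, mirroring the abstract's claim that agreement signals which documents are worth mining for features.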




    Published In

    SIGIR '07: Proceedings of the 30th annual international ACM SIGIR conference on Research and development in information retrieval
    July 2007
    946 pages
    ISBN:9781595935977
    DOI:10.1145/1277741


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. highly agreed documents
    2. relevance prediction

    Qualifiers

    • Article

    Conference

SIGIR07: The 30th Annual International SIGIR Conference
    July 23 - 27, 2007
    Amsterdam, The Netherlands

    Acceptance Rates

    Overall Acceptance Rate 792 of 3,983 submissions, 20%
