DOI: 10.1145/2348283.2348422

Relevance as a subjective and situational multidimensional concept

Published: 12 August 2012

Abstract

Relevance is the central concept of information retrieval. Although its important role is unanimously accepted among researchers, numerous different definitions of the term have emerged over the years. Considerable effort has been put into creating consistent and universally applicable descriptions of relevance in the form of relevance frameworks. Across these various formal systems of relevance, a wide range of relevance criteria has been identified. The single criterion that is probably used most frequently, and that in some applications even becomes a synonym for relevance, is topicality. It expresses a document's topical overlap with the user's information need. For textual resources, it is often estimated from term co-occurrences between query and document (see the sketch after the list below). There is, however, a significant number of further noteworthy relevance criteria. Prominent examples are:

  • Currency determines how recent and up to date the document is. Outdated information may have become invalid over time.
  • Availability expresses how easy it is to obtain the document. Users might not want to invest more than a threshold amount of resources (e.g., disk space, downloading time or money) to get the document.
  • Readability describes how readable and understandable the document is. A document with high topical relevance towards a given information need can become irrelevant if the user is not able to extract the desired information from it.
  • Credibility covers criteria such as the document author's expertise, the publication's reputation and the document's general trustworthiness.
  • Novelty describes the document's contribution to satisfying an information need with respect to the user's context, for example previous search results or general knowledge about the domain.
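
As an illustration of the topicality estimate mentioned above, here is a minimal sketch (not taken from the paper) that scores topical overlap as TF-IDF weighted cosine similarity between the query's and the document's term vectors; all function and variable names are hypothetical.

```python
# Illustrative sketch only: TF-IDF cosine similarity as a simple estimate of
# topical overlap between a query and a document. Names are hypothetical.
import math
from collections import Counter

def tf_idf_vector(tokens, idf):
    """Sparse TF-IDF vector (term -> weight) for a tokenized text."""
    return {t: tf * idf.get(t, 0.0) for t, tf in Counter(tokens).items()}

def topicality(query_tokens, doc_tokens, corpus):
    """Cosine similarity between TF-IDF vectors of query and document.

    corpus is a list of tokenized documents used to derive IDF weights.
    """
    n = len(corpus)
    df = Counter(t for doc in corpus for t in set(doc))
    idf = {t: math.log(1.0 + n / df[t]) for t in df}
    q = tf_idf_vector(query_tokens, idf)
    d = tf_idf_vector(doc_tokens, idf)
    dot = sum(w * d.get(t, 0.0) for t, w in q.items())
    norm = math.sqrt(sum(w * w for w in q.values())) * math.sqrt(sum(w * w for w in d.values()))
    return dot / norm if norm else 0.0
```

In practice, retrieval systems use more refined term-weighting functions such as BM25; the point here is only that topicality reduces to term co-occurrence statistics between query and document.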
It is evident that these criteria can have very different scopes. Some of them are static characteristics of the document or its author; others depend on the concrete information need at hand or even on the user's search context. Currently, state-of-the-art retrieval models often treat relevance (regardless of which interpretation of the term was chosen) as an atomic concept that can be expressed through topical overlap between document and query or through a plain linear combination of multiple scores. Considering the broad audiences a web search engine has to serve, such a method does not seem optimal, as the concrete composition of relevance will vary from person to person depending on social and educational context. Furthermore, each individual can be expected to have situational preferences for certain combinations of relevance facets depending on the information need at hand. We investigate combination schemes that respect the dimension-specific relevance distributions. In particular, we developed a risk-aware method of combining relevance criteria inspired by economic portfolio theory. As a first stage, we applied this method to result set diversification across dimensions.
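
The abstract does not spell out the authors' formulation, but the following hedged sketch illustrates what a portfolio-inspired, risk-aware combination of per-dimension relevance scores could look like: documents are selected greedily by trading their expected relevance (mean over dimension scores) against risk (variance over those scores) and redundancy with documents already selected. The objective, parameters and names are assumptions for illustration, not the paper's method.

```python
# Hedged sketch, not the authors' method: greedy, portfolio-inspired selection
# over per-dimension relevance scores (e.g., topicality, currency, credibility).
from statistics import mean, pvariance

def portfolio_rank(doc_scores, k, risk_aversion=0.5, redundancy_weight=0.1):
    """doc_scores maps doc_id -> list of per-dimension relevance scores."""
    selected = []
    remaining = dict(doc_scores)
    while remaining and len(selected) < k:
        def objective(doc_id):
            scores = remaining[doc_id]
            expected = mean(scores)          # expected relevance across dimensions
            risk = pvariance(scores)         # spread over the dimension scores
            redundancy = sum(                # overlap with already-selected profiles
                sum(a * b for a, b in zip(scores, doc_scores[s]))
                for s in selected
            )
            return expected - risk_aversion * risk - redundancy_weight * redundancy
        best = max(remaining, key=objective)
        selected.append(best)
        remaining.pop(best)
    return selected
```

The risk_aversion and redundancy_weight parameters control how strongly the selection prefers balanced dimension profiles and diversity with respect to documents already in the result list.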


Cited By

  • (2017) Understanding human quality judgment in assessing online forum contents for thread retrieval purpose. DOI: 10.1063/1.5005400 (020067). Online publication date: 2017.


Published In

SIGIR '12: Proceedings of the 35th international ACM SIGIR conference on Research and development in information retrieval
August 2012
1236 pages
ISBN:9781450314725
DOI:10.1145/2348283

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 12 August 2012


Author Tags

  1. relevance models
  2. score combination
  3. search personalisation

Qualifiers

  • Abstract

Conference

SIGIR '12

Acceptance Rates

Overall Acceptance Rate 792 of 3,983 submissions, 20%
