Abstract
Discussion of relevance has permeated the information science literature for the past 50+ years, and yet we are no closer to resolving the matter. In this research we developed a set of measures to operationalize the dimensions underpinning Saracevic’s manifestations of relevance. We used an existing data set collected from 48 participants who used a web search engine to complete four search tasks representing four subject domains. From this study, which assessed multiple aspects of the search process – from the cognitive to the behavioural – we derived a set of measures for cognitive, motivational, situational, topical and system relevances. Using regression analysis, we demonstrate how the measures partially predict search success, and using factor analysis we identify the underlying constructs of relevance. The results show that Saracevic’s five manifestations may be merged into three types, representing the user, the system and the task.
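The abstract describes two analyses: a regression of search success on the five relevance measures, and a factor analysis of those measures to surface underlying constructs. The following is a minimal sketch of that general approach in Python; the data file, column names, success score and three-factor solution are illustrative assumptions, not the authors' actual code or variables.

# Sketch of the two analyses described in the abstract (hypothetical data layout).
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical per-task data: one row per participant x task (placeholder file name).
df = pd.read_csv("relevance_measures.csv")
measures = ["cognitive", "motivational", "situational", "topical", "system"]

# 1. Regression: do the five relevance measures predict search success?
X = sm.add_constant(df[measures])
ols = sm.OLS(df["search_success"], X).fit()
print(ols.summary())  # R^2 shows how much of search success the measures explain

# 2. Factor analysis: how many latent constructs underlie the five measures?
Z = StandardScaler().fit_transform(df[measures])
fa = FactorAnalysis(n_components=3).fit(Z)  # three factors, as the abstract suggests
print(pd.DataFrame(fa.components_, columns=measures))  # factor loadings per measure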
References
Barry, C.L., Schamber, L.: Users’ criteria for relevance evaluation: A cross-situational comparison. Inform. Process. Manag. 34, 219–236 (1998)
Borlund, P.: The concept of relevance in IR. J. Am. Soc. Inform. Sci. 54(10), 913–925 (2003)
Borlund, P., Ingwersen, P.: The development of a method for the evaluation of interactive information retrieval systems. J. Doc. 53(3), 225–250 (1997)
Cleverdon, C.W.: Information and its retrieval. In: ASLIB Proc., vol. 22, pp. 538–549 (1960)
Cosijn, E., Ingwersen, P.: Dimensions of relevance. Inform. Process. Manag. 36, 533–550 (2000)
DeLone, W.H., McLean, E.R.: Information systems success: the quest for the dependent variable. Inform. Syst. Res. 3(1), 60–95 (1992)
DeLone, W.H., McLean, E.R.: Information systems success revisited. In: 35th HICSS Proc. (2002)
Greisdorf, H.: Relevance thresholds: a multi-stage predictive model of how users evaluate information. Inform. Process. Manag. 39(3), 403–423 (2003)
Harter, S.: Psychological relevance and information science. J. Am. Soc. Inform. Sci. 43, 602–615 (1992)
Harter, S.P., Hert, C.A.: Evaluation of information retrieval systems: Approaches, issues, and methods. Annu. Rev. Inform. Sci. 32, 3–94 (1997)
Ingwersen, P.: Cognitive perspectives of information retrieval interaction: elements of a cognitive IR theory. J. Doc. 52(1), 3–50 (1996)
ISO: Ergonomic requirements for office work with visual display terminals (VDTs): Part 11: Guidance on usability. ISO 9241-11:1998 (1998)
Kekäläinen, J., Järvelin, K.: Using graded relevance assessments in IR evaluation. J. Am. Soc. Inform. Sci. 53(13), 1120–1129 (2002)
Maglaughlin, K.L., Sonnenwald, D.H.: User perspectives on relevance criteria: A comparison among relevant, partially relevant, and not-relevant judgments. J. Am. Soc. Inform. Sci. 53(5), 327–342 (2002)
Mizzaro, S.: How many relevances in information retrieval? Interact. Comput. 10(3), 303–320 (1998)
Reid, J.: A new task-oriented paradigm for information retrieval: implications for evaluation of information retrieval systems. In: CoLIS3 Proc., pp. 97–108 (1999)
Saracevic, T.: Relevance: a review of and a framework for the thinking on the notion in information science. J. Am. Soc. Inform. Sci. 26, 321–343 (1975)
Saracevic, T.: Relevance reconsidered. In: CoLIS Proc., vol. 2, pp. 201–218 (1996)
Schamber, L.: Relevance and information behavior. Annu. Rev. Inform. Sci. 29, 3–48 (1994)
Schamber, L., Eisenberg, M.B., Nilan, M.S.: A re-examination of relevance: toward a dynamic, situational definition. Inform. Process. Manag. 26, 755–775 (1990)
Spink, A., Greisdorf, H.: Regions and levels: Measuring and mapping users’ relevance judgments. J. Am. Soc. Inform. Sci. 52(2), 161–173 (2001)
Spink, A., Greisdorf, H., Bateman, J.: From highly relevant to not relevant: examining different regions of relevance. Inform. Process. Manag. 34, 599–621 (1998)
Su, L.T.: Evaluation measures for interactive information retrieval. Inform. Process. Manag. 28(4), 503–516 (1992)
Tabachnick, B.G., Fidell, L.S.: Using multivariate statistics, 4th edn. Allyn & Bacon (2001)
Tague-Sutcliffe, J.: Measuring Information. Academic Press, New York (1995)
Tague-Sutcliffe, J., Toms, E.G.: Information systems design via the quantitative analysis of user transaction logs. In: Presented at the 5th ICSI, River Forest, Illinois (1995)
Tang, R., Solomon, P.: Towards an understanding of the dynamics of relevance judgments: an analysis of one person’s search behavior. Inform. Process. Manag. 34, 237–256 (1998)
Toms, E.G., Freund, L., Kopak, R., Bartlett, J.C.: The effect of task domain on search. In: CASCON, IBM, Toronto, pp. 303–312 (2003)
Vakkari, P., Sormunen, E.: The influence of relevance levels on the effectiveness of interactive information retrieval. J. Am. Soc. Inform. Sci. 55(11), 963–969 (2004)
Wildemuth, B.M., Barry, C., Luo, L., Oh, S.: Establishing a research agenda for studies of online search behaviors: a Delphi study (2004); details of the study and preliminary reports at http://ils.unc.edu/sig_use_delphi/
Yuan, W., Meadow, C.T.: A study of the use of variables in information retrieval user studies. J. Am. Soc. Inform. Sci. 50, 140–150 (1999)
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Toms, E.G., O’Brien, H.L., Kopak, R., Freund, L. (2005). Searching for Relevance in the Relevance of Search. In: Crestani, F., Ruthven, I. (eds) Context: Nature, Impact, and Role. CoLIS 2005. Lecture Notes in Computer Science, vol 3507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11495222_7
DOI: https://doi.org/10.1007/11495222_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26178-0
Online ISBN: 978-3-540-32101-9
eBook Packages: Computer Science (R0)