DOI: 10.1145/3331184.3331270

Time-Limits and Summaries for Faster Relevance Assessing

Published: 18 July 2019

Abstract

Relevance assessing is a critical part of test collection construction, as well as of applications such as high-recall retrieval that require large amounts of relevance feedback. These applications require tens of thousands of relevance assessments, and assessing costs are directly related to the speed at which assessments are made. We conducted a user study with 60 participants in which we investigated the impact of time limits (15, 30, and 60 seconds) and document size (full length vs. short summaries) on relevance assessing. Participants were shown either full documents or document summaries, which they had to judge within a 15, 30, or 60 second time limit per document. We found that a time limit as short as 15 seconds, or judging document summaries in place of full documents, can significantly speed judging without significantly affecting judging quality. Participants found judging document summaries with a 60-second time limit to be the easiest and best experience of the six conditions. Thus, while time limits can speed judging, the same speed benefits can be had with high-quality document summaries, which also provide an improved judging experience for assessors.
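The cost argument that motivates the paper is easy to make concrete. The back-of-the-envelope sketch below compares total assessor effort under the three per-document time limits studied; the batch size of 50,000 judgments (standing in for "tens of thousands") and the worst-case assumption that every judgment uses its full time budget are illustrative assumptions, not figures from the study.

```python
# Illustrative sketch only: total assessor effort for a hypothetical
# judging task under the three per-document time limits from the study.
# Assumptions (not from the paper): 50,000 judgments, and every
# judgment uses its full time budget.

NUM_JUDGMENTS = 50_000               # hypothetical batch size
TIME_LIMITS_SECONDS = (15, 30, 60)   # conditions studied in the paper

for limit in TIME_LIMITS_SECONDS:
    total_hours = NUM_JUDGMENTS * limit / 3600
    print(f"{limit:>2}s per document -> {total_hours:,.0f} assessor-hours")

# Expected output:
# 15s per document -> 208 assessor-hours
# 30s per document -> 417 assessor-hours
# 60s per document -> 833 assessor-hours
```

Under these assumptions, cutting the per-document budget from 60 to 15 seconds saves roughly 625 assessor-hours, which is why even modest per-document speedups matter at this scale.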


Cited By

  • (2020) Computer-Assisted Relevance Assessment: A Case Study of Updating Systematic Medical Reviews. Applied Sciences 10(8), 2845. DOI: 10.3390/app10082845. Online publication date: 20-Apr-2020.


    Information

    Published In

SIGIR '19: Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval
July 2019, 1512 pages
ISBN: 9781450361729
DOI: 10.1145/3331184
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. document summaries
    2. relevance assessing
    3. time limits

    Qualifiers

• Short paper

    Funding Sources

    • Natural Sciences and Engineering Research Council of Canada

    Conference

    SIGIR '19

    Acceptance Rates

SIGIR '19 paper acceptance rate: 84 of 426 submissions (20%)
Overall acceptance rate: 792 of 3,983 submissions (20%)

