DOI: 10.1145/3209978.3210139
Short paper

A Test Collection for Coreferent Mention Retrieval

Published: 27 June 2018

Abstract

This paper introduces the coreferent mention retrieval task, in which the goal is to retrieve sentences that mention a specific entity, given a query by example that provides one sentence mentioning that entity. The development of a coreferent mention retrieval test collection is then described. Results are presented for five coreferent mention retrieval systems, both to illustrate the use of the collection and to document the pooled results on which human coreference judgments were performed. The new test collection is built from content available from the Linguistic Data Consortium; the partitioning and human annotations used to create the test collection atop that content are being made freely available.
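To make the task definition concrete, here is a minimal illustrative sketch of a query-by-example baseline: rank candidate sentences by TF-IDF cosine similarity to the single query sentence. This is an assumption for illustration only, not one of the five systems evaluated in the paper; all function names and the scoring scheme are invented here.

```python
import math
from collections import Counter

def tokenize(text):
    # Crude whitespace tokenizer with basic punctuation stripping.
    return [t.strip(".,!?;:").lower() for t in text.split()]

def tfidf_vectors(sentences):
    # Build a TF-IDF vector per sentence, treating each sentence as a document.
    docs = [Counter(tokenize(s)) for s in sentences]
    df = Counter()
    for d in docs:
        df.update(d.keys())  # document frequency over the sentence collection
    n = len(docs)
    return [
        {t: tf * math.log((1 + n) / (1 + df[t])) for t, tf in d.items()}
        for d in docs
    ]

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_mentions(query_sentence, candidate_sentences):
    # Score every candidate against the query-by-example sentence and
    # return candidates sorted from most to least similar.
    vecs = tfidf_vectors([query_sentence] + candidate_sentences)
    query_vec, cand_vecs = vecs[0], vecs[1:]
    scored = sorted(
        ((cosine(query_vec, c), s) for c, s in zip(cand_vecs, candidate_sentences)),
        key=lambda pair: -pair[0],
    )
    return [s for _, s in scored]
```

A lexical baseline like this will miss coreferent mentions that share no surface vocabulary with the query sentence (e.g. pronouns or nicknames), which is precisely what makes the task, and judgments pooled over multiple systems, interesting.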



Published In

SIGIR '18: The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval
June 2018
1509 pages
ISBN:9781450356572
DOI:10.1145/3209978

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. coreference
  2. entity linking
  3. mention retrieval

Qualifiers

  • Short-paper

Conference

SIGIR '18

Acceptance Rates

SIGIR '18 paper acceptance rate: 86 of 409 submissions (21%)
Overall acceptance rate: 792 of 3,983 submissions (20%)
