DOI: 10.1145/3539618.3591984 · ACM SIGIR Conference Proceedings · Short paper

Exploiting Ubiquitous Mentions for Document-Level Relation Extraction

Published: 18 July 2023

Abstract

Recent years have witnessed the transition from sentence-level to document-level relation extraction (RE), bringing new formulations, new methods, and new insights. Yet the fundamental concept of a mention remains neither well considered nor well defined. Current datasets typically use automatically detected named entities as mentions, which gives rise to the missing-reference problem. We show that this phenomenon hinders models' reasoning abilities. To address it, we propose incorporating coreferences (e.g., pronouns and common nouns) into mentions, and on this basis we refine and re-annotate the widely used DocRED benchmark as R-DocRED. We evaluate a range of methods and conduct thorough experiments to demonstrate the efficacy of our formulation. Specifically, the results indicate that incorporating coreferences helps reduce long-range dependencies, further improving models' robustness and generalization under adversarial and low-resource settings. The new dataset is made publicly available for future research.


Cited By

  • (2024) Enhancing Document-Level Relation Extraction with Entity Pronoun Resolution and Relation Correlation. In Natural Language Processing and Chinese Computing, 174-186. https://doi.org/10.1007/978-981-97-9434-8_14. Online publication date: 2 November 2024.



    Published In

    SIGIR '23: Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval
    July 2023
    3567 pages
    ISBN:9781450394086
    DOI:10.1145/3539618


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. document-level relation extraction
    2. information extraction
    3. natural language processing

    Qualifiers

    • Short-paper


    Conference

    SIGIR '23

    Acceptance Rates

    Overall Acceptance Rate 792 of 3,983 submissions, 20%

