research-article

Entity relation extraction based on Biaffine model with embedded context features

Published: 28 February 2024

ABSTRACT

The extraction of entities and relations from unstructured text is a critical problem in information extraction and an essential component of knowledge graph construction. Most existing relation extraction models first extract all subjects and then extract objects and relations conditioned on the identified subjects. This pipeline depends heavily on subject extraction: the quality of the extracted subjects strongly affects the subsequent extraction of objects and relations. Moreover, most models handle relations with only a simple classifier, differing merely in the classifier's input, which can cause the mutual dependency between subject and object to be overlooked. To avoid discarding the inherent features of input sentences, we generate three distinct token representation sequences, for subjects, objects, and relations, at the encoding layer. Combined with the explicit injection of context features from the encoding layer, entity extraction in both directions can be mutually reinforcing, yielding as many candidate entity pairs as possible and alleviating the limitation of relations being conditioned on previously extracted subjects. Whereas most existing models use a simple classifier for relations, our model employs a Biaffine model combined with an attention mechanism to assign all potential relations to each entity pair. Its advantage over simple classifiers is that it maintains a weight matrix for each relation, accurately modeling relation-specific features, and its probability calculation can precisely capture the interaction between subjects and objects. We evaluated the proposed model on the NYT and WebNLG datasets. Extensive experimental results demonstrate the effectiveness of the proposed model, which achieves strong performance on both datasets.
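The biaffine scoring idea described above, keeping one weight matrix per relation and scoring every (subject, object) token pair, can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation; all function names, shapes, and the bias-augmentation convention are assumptions.

```python
import numpy as np

def biaffine_scores(subj: np.ndarray, obj: np.ndarray, U: np.ndarray) -> np.ndarray:
    """Score all (subject, object) token pairs for every relation.

    subj, obj: (batch, seq_len, hidden) token representations, e.g. from the
               subject-oriented and object-oriented encoder outputs.
    U:         (num_relations, hidden + 1, hidden + 1) -- one weight matrix
               per relation; the +1 appends a bias term to each side.
    Returns:   (batch, num_relations, seq_len, seq_len) scores, where
               scores[b, r, i, j] = s_i^T U_r o_j.
    """
    # Append a constant 1 so each relation matrix also carries bias terms.
    ones = np.ones(subj.shape[:-1] + (1,))
    s = np.concatenate([subj, ones], axis=-1)  # (B, L, H+1)
    o = np.concatenate([obj, ones], axis=-1)   # (B, L, H+1)
    # Bilinear form per relation over all token pairs in one einsum.
    return np.einsum("bih,rhk,bjk->brij", s, U, o)

rng = np.random.default_rng(0)
subj = rng.standard_normal((2, 5, 8))   # batch=2, seq_len=5, hidden=8
obj = rng.standard_normal((2, 5, 8))
U = rng.standard_normal((3, 9, 9))      # 3 hypothetical relation types
scores = biaffine_scores(subj, obj, U)
print(scores.shape)  # (2, 3, 5, 5)
```

Because a full matrix is kept per relation, each relation's score is a bilinear interaction between the subject and object vectors, rather than a linear function of their concatenation as in a flat classifier.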


• Published in
          MLNLP '23: Proceedings of the 2023 6th International Conference on Machine Learning and Natural Language Processing
          December 2023
          252 pages
          ISBN:9798400709241
          DOI:10.1145/3639479

          Copyright © 2023 ACM


          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Qualifiers

          • research-article
          • Research
          • Refereed limited
