
A joint extraction model of entities and relations based on relation decomposition

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Extracting entities and relations from unstructured text is an essential task in the field of information extraction. Existing work mainly follows pipeline extraction or joint decoding methods. However, these methods cannot extract overlapping entities and relations, and they ignore the correlation between the entity and relation extraction tasks. In this paper, we first introduce the BERT pre-trained model to encode the text at a finer granularity. We then decompose extraction into relation extraction and entity recognition: relation extraction is cast as a relation classification task, and entity recognition as a sequence labeling task in which both a head entity and a tail entity are recognized. We evaluate the model on the New York Times (NYT) and WebNLG datasets, where it outperforms most existing models. Experimental results show that our model fully captures the semantic interdependence between entity and relation extraction, reduces the interference of unrelated entity pairs, and effectively solves the entity-overlap problem.
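The decomposition the abstract describes can be illustrated with a minimal sketch: stage one predicts which relations a sentence expresses (relation classification), and stage two runs one sequence-labeling pass per predicted relation to tag the head and tail entities. Everything below is illustrative, not the paper's implementation: the toy classifier and tagger stand in for the BERT-based components, and all names and tag conventions are assumptions.

```python
from typing import Callable, List, Tuple

# BIO-style tags distinguishing the head and tail entities of one relation
# (a hypothetical tagging scheme, not necessarily the paper's).
TAGS = ["O", "B-HEAD", "I-HEAD", "B-TAIL", "I-TAIL"]

def extract_triples(
    tokens: List[str],
    classify_relations: Callable[[List[str]], List[str]],
    tag_entities: Callable[[List[str], str], List[str]],
) -> List[Tuple[str, str, str]]:
    """Decomposed extraction: relations first, then per-relation entity tagging.

    Running the tagger once per predicted relation is what lets one sentence
    yield overlapping triples (the same entity can appear in several).
    """
    triples = []
    for relation in classify_relations(tokens):   # stage 1: relation classification
        tags = tag_entities(tokens, relation)     # stage 2: sequence labeling
        head = _collect_span(tokens, tags, "HEAD")
        tail = _collect_span(tokens, tags, "TAIL")
        if head and tail:
            triples.append((head, relation, tail))
    return triples

def _collect_span(tokens: List[str], tags: List[str], role: str) -> str:
    """Join all tokens whose tag ends with the given role (HEAD or TAIL)."""
    return " ".join(t for t, g in zip(tokens, tags) if g.endswith(role))

# Toy stand-ins for the two learned (BERT-based) components.
def toy_classifier(tokens: List[str]) -> List[str]:
    return ["born_in"] if "born" in tokens else []

def toy_tagger(tokens: List[str], relation: str) -> List[str]:
    tags = ["O"] * len(tokens)
    tags[0] = "B-HEAD"    # e.g. "Obama"
    tags[-1] = "B-TAIL"   # e.g. "Hawaii"
    return tags

print(extract_triples("Obama was born in Hawaii".split(), toy_classifier, toy_tagger))
# → [('Obama', 'born_in', 'Hawaii')]
```

The point of the structure is that the entity tagger is conditioned on a specific relation, so unrelated entity pairs never need to be scored against one another.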


Figs. 1–7



Acknowledgements

This work is supported by the National Natural Science Foundation of China under Grant Nos. 61862063, 61502413, 61262025 and 62002310; the National Social Science Foundation of China under Grant No. 18BJL104; the Natural Science Foundation of the Key Laboratory of Software Engineering of Yunnan Province under Grant No. 2020SE301; the Yunnan Science and Technology Major Project under Grant Nos. 202002AE090010 and 202002AD080002-5; and the Data Driven Software Engineering Innovative Research Team Funding of Yunnan Province under Grant No. 2017HC012.

Author information


Corresponding author

Correspondence to Xuan Zhang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Gao, C., Zhang, X., Liu, H. et al. A joint extraction model of entities and relations based on relation decomposition. Int. J. Mach. Learn. & Cyber. 13, 1833–1845 (2022). https://doi.org/10.1007/s13042-021-01491-6
