Abstract
Joint extraction of entities and relations aims to recognize entities and semantic relations simultaneously, which is significant for knowledge graph construction. Recently, many effective joint models have used dependency trees to capture the structural information of sentences. However, most dependency-based methods cannot make full use of the dependency information, because they consider only the connection information of dependency trees and ignore the influence of different nodes on the connected edges. In this paper, we establish a novel model, called MGCN, that extracts entities and relations simultaneously using improved multi-task Graph Convolutional Networks. Specifically, considering the importance of node information, we merge both node and edge information into Graph Convolutional Networks (GCNs). In addition, to recognize overlapping relations, we propose an efficient strategy that maps the multiple relational labels of a sentence into a unique code. Finally, we evaluate our joint model on two public datasets, and the experimental results show that our model outperforms state-of-the-art models.
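The abstract states that node and edge (dependency-label) information are merged into the GCN propagation so that different nodes influence the connected edges differently. As a rough illustrative sketch only, not the paper's actual formulation, the snippet below shows one way an edge-aware GCN layer could be written; the function name `edge_aware_gcn_layer`, the scoring vector `W_e`, and the soft arc-weighting scheme are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed, not the authors' exact model): a GCN layer whose
# propagation weights are computed from the head node, the tail node, and a
# dependency-label (edge) embedding, so node information shapes each edge.
import numpy as np

def edge_aware_gcn_layer(H, edges, edge_emb, W, W_e):
    """
    H        : (n, d)   node representations (e.g., BiLSTM outputs)
    edges    : list of (i, j, label_id) dependency arcs, including self-loops
    edge_emb : (L, d)   embedding table for dependency labels
    W        : (d, d')  node feature transform
    W_e      : (3*d,)   scoring vector over [head; tail; edge] features
    """
    n, _ = H.shape
    A = np.zeros((n, n))
    for i, j, lbl in edges:
        # Score each arc from node + edge information and use it as the
        # (unnormalized) weight of that edge in the propagation matrix.
        feat = np.concatenate([H[i], H[j], edge_emb[lbl]])
        A[i, j] = np.exp(feat @ W_e)
    A = A / (A.sum(axis=1, keepdims=True) + 1e-8)  # row-normalize the weights
    return np.maximum(A @ H @ W, 0.0)              # ReLU(A_hat H W)

# Tiny shape check under the assumptions above.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))
edge_emb = rng.normal(size=(5, 8))
W, W_e = rng.normal(size=(8, 8)), rng.normal(size=(24,))
edges = [(0, 1, 2), (1, 2, 0), (2, 3, 4)] + [(i, i, 0) for i in range(4)]
print(edge_aware_gcn_layer(H, edges, edge_emb, W, W_e).shape)  # (4, 8)
```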






