
A novel entity joint annotation relation extraction model

Published in: Applied Intelligence

Abstract

Social networks are an indispensable part of daily life, and text is their most common carrier. Extracting entities and relations from social-media text helps to mine people’s views and attitudes. However, identifying entity pairs that are shared across multiple relations in a sentence, as well as subjects and objects that overlap within a single relation, remains a difficult problem that needs to be solved. We propose a new relation extraction model named GraphJoint, which models the relation extraction task as a mapping from relations to entities. First, we apply a pre-trained BERT encoder to encode the words and build a text graph for each sentence; the message-passing mechanism of a graph neural network then extracts the sentence’s text features, which are used to classify the relations in the sentence. Second, we reuse the extracted features, augmented with relation features, to extract the entities; a self-attention mechanism and dilated gated convolutions extract entity features further. Finally, we use a joint annotation method to mark the head, tail, and overlapping parts of the subject and object, transforming the task into a sequence labeling task. Experiments against other advanced algorithms on two public datasets show that our method improves the F1 score by 3.6% and 3.4% on the two datasets, respectively, and performs particularly well in recognizing overlapping entity pairs.
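The joint annotation idea described in the abstract — tagging the head, tail, and overlapping parts of the subject and object so that triple extraction becomes sequence labeling — can be sketched as follows. This is an illustrative assumption, not the authors’ exact scheme: the abstract does not give the concrete tag set, so the labels `H-S`/`T-S` (subject head/tail), `H-O`/`T-O` (object head/tail), `O-SO` (subject–object overlap), and `N` (outside) are hypothetical placeholders.

```python
def joint_annotate(n_tokens, subj_span, obj_span):
    """Emit one tag per token for a single (relation, subject, object) triple.

    subj_span / obj_span are (start, end) token indices, end exclusive.
    Tokens belonging to both entities get the overlap tag, which is the
    case the joint annotation scheme is designed to represent.
    """
    subj = set(range(*subj_span))
    obj = set(range(*obj_span))
    tags = []
    for i in range(n_tokens):
        in_s, in_o = i in subj, i in obj
        if in_s and in_o:
            tags.append("O-SO")  # token shared by subject and object
        elif in_s:
            tags.append("H-S" if i == subj_span[0] else "T-S")
        elif in_o:
            tags.append("H-O" if i == obj_span[0] else "T-O")
        else:
            tags.append("N")  # token outside both entities
    return tags

# Overlapping subject/object, e.g. subject = tokens 0..3, object = tokens 0..2:
print(joint_annotate(5, (0, 4), (0, 3)))
# → ['O-SO', 'O-SO', 'O-SO', 'T-S', 'N']
```

Because each relation gets its own tag sequence, entity pairs shared by several relations pose no ambiguity: the same tokens are simply re-tagged once per predicted relation.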

[Figures 1–7 appear in the full article.]



Author information

Corresponding author: Dechang Pi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest, and no commercial or associative interest that represents a conflict of interest in connection with the submitted work.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Xu, M., Pi, D., Cao, J. et al. A novel entity joint annotation relation extraction model. Appl Intell 52, 12754–12770 (2022). https://doi.org/10.1007/s10489-021-03002-0
