
A prototype evolution network for relation extraction

Published in Applied Intelligence

Abstract

Prototypical networks transform relation instances and relation types into the same semantic space, where each relation instance is assigned the type of its nearest prototype. Traditional prototypical network methods generate relation prototypes by averaging the sentence representations in a predefined support set, which suffers from two key limitations. First, the averaged prototypes are sensitive to outliers in the support set, which can skew them. Second, averaging lacks the representational capacity needed to capture the full complexity of the relation extraction task. To address these limitations, we propose the Prototype Evolution Network (PEN) for relation extraction. First, we assign a type cue to each relation instance to mine the semantics of its relation type. Based on the type cues and relation instances, we then present a prototype refiner, comprising a multichannel convolutional neural network and a scaling module, to learn and refine the relation prototypes. Finally, we introduce the historical prototypes from each episode into the current prototype learning process to enable continuous prototype evolution. We evaluate PEN on the ACE 2005, SemEval 2010, and CoNLL04 datasets, and the results demonstrate impressive improvements, with PEN outperforming existing state-of-the-art methods.
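For readers less familiar with the prototypical-network setup that the abstract builds on, the following minimal sketch illustrates the two ideas involved: prototypes obtained by averaging support-set embeddings with nearest-prototype classification, and a simple cross-episode blending of current and historical prototypes. It is an illustrative sketch only, not the authors' PEN: the type cues, the multichannel-CNN prototype refiner with its scaling module, and the paper's actual evolution rule are omitted, and the moving-average update with coefficient `alpha` is an assumption introduced purely for illustration.

```python
# A minimal sketch (NOT the authors' PEN): classic prototypical-network
# classification plus a hypothetical moving-average blend of current and
# historical prototypes. Type cues, the multichannel-CNN refiner and its
# scaling module, and PEN's actual evolution rule are omitted; `alpha` and
# all shapes below are illustrative assumptions.
import torch
import torch.nn.functional as F


def compute_prototypes(support_emb, support_labels, num_classes):
    """Average the support embeddings of each relation type (the standard
    prototypical-network prototype)."""
    dim = support_emb.size(-1)
    prototypes = torch.zeros(num_classes, dim)
    for c in range(num_classes):
        prototypes[c] = support_emb[support_labels == c].mean(dim=0)
    return prototypes


def classify(query_emb, prototypes):
    """Assign each query instance to its nearest prototype: softmax over
    negative squared Euclidean distances."""
    dists = torch.cdist(query_emb, prototypes) ** 2    # (num_query, num_classes)
    return F.softmax(-dists, dim=-1)


def evolve(current, historical, alpha=0.5):
    """Hypothetical evolution step: blend the current episode's prototypes
    with historical ones via an exponential moving average."""
    return alpha * historical + (1.0 - alpha) * current


if __name__ == "__main__":
    torch.manual_seed(0)
    num_classes, shots, dim = 4, 5, 16
    support_emb = torch.randn(num_classes * shots, dim)   # encoded support sentences
    support_labels = torch.arange(num_classes).repeat_interleave(shots)
    query_emb = torch.randn(3, dim)                       # encoded query sentences

    protos = compute_prototypes(support_emb, support_labels, num_classes)
    historical = protos + 0.1 * torch.randn_like(protos)  # stand-in for prototypes from earlier episodes
    protos = evolve(protos, historical)
    print(classify(query_emb, protos).argmax(dim=-1))     # predicted relation types
```

In PEN, the plain averaging and the fixed blending above are replaced by the learned prototype refiner and the evolution mechanism described in the article.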



Data Availability

We obtained the publicly available SemEval 2010 and CoNLL04 datasets from the Hugging Face repository (https://huggingface.co/datasets/joelniklaus/sem_eval_2010_task_8, https://huggingface.co/datasets/DFKI-SLT/conll04) and purchased the ACE 2005 dataset from the Linguistic Data Consortium (https://www.ldc.upenn.edu/language-resources/data/obtaining).


Acknowledgements

We appreciate the efforts of the editors and reviewers in improving the quality of our submitted article. This work was supported by the National Key R&D Program of China (No. 2023YFC3304500), the National Natural Science Foundation of China (Grant Nos. 62166007 and 62066008), and the Key Technology R&D Program of Guizhou Province (No. [2024]003).

Author information


Contributions

Kai Wang: Investigation, Conceptualization, Methodology, Formal Analysis, Validation, Visualization, and Original Draft. Yanping Chen: Review & Editing, Funding Acquisition. Ruizhang Huang: Review & Editing. Yongbin Qin: Funding Acquisition, Supervision, Project Administration.

Corresponding authors

Correspondence to Yanping Chen or Yongbin Qin.

Ethics declarations

Ethical and informed consent for data used

Data will be made available on reasonable request.

Competing interests

The authors have no competing interests to declare that are relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wang, K., Chen, Y., Huang, R. et al. A prototype evolution network for relation extraction. Appl Intell 55, 8 (2025). https://doi.org/10.1007/s10489-024-05864-6
