Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction

  • Published in: Applied Intelligence

Abstract

In relation extraction tasks, distant supervision is a highly effective method that automatically generates training data by aligning knowledge bases (KBs) with texts, thereby avoiding the cost of manually labeling data. However, distant supervision is inevitably accompanied by the wrong-labeling problem. This paper presents a neural relation extraction method that addresses two problems in the bags generated by distant supervision: noisy words within sentences and poor feature information in one-sentence bags. Previous studies mainly focus on sentence-level and bag-level denoising by designing neural networks. In this paper, we propose a piecewise convolutional neural network with position attention and similar bag attention for distant supervision relation extraction (PCNN-PATT-SBA). First, we propose a position attention based on a Gaussian distribution, which models the positional relationship between non-entity words and entity words to assign weights to the words of a sentence; this is expected to reduce the influence of noisy words. In addition, we propose a similar bag attention based on the feature similarity between different bags, which merges the features of similar bags to compensate for the poor feature information of one-sentence bags. Experimental results on the New York Times dataset demonstrate the effectiveness of the proposed position attention and similar bag attention modules. Our method also achieves better relation extraction accuracy than state-of-the-art methods on this dataset: compared with the bag-of-sentence attention model, the P value increases by 6.9%; compared with selective attention over instances (PCNN-ATT), by 25.6%; and compared with instance-level adversarial training (PCNN-HATT), by 12.1%.
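The two attention mechanisms described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the authors' implementation: the Gaussian width `sigma`, the use of cosine similarity between bag features, and the softmax-weighted merge are illustrative choices, and the paper's actual formulas may differ.

```python
import numpy as np

def gaussian_position_attention(positions, entity_positions, sigma=2.0):
    """Weight each token by a Gaussian of its distance to the nearest entity word,
    so words far from both entities (likely noise) receive low weight."""
    positions = np.asarray(positions, dtype=float)
    entities = np.asarray(entity_positions, dtype=float)
    # distance from every token to its nearest entity token
    d = np.min(np.abs(positions[:, None] - entities[None, :]), axis=1)
    w = np.exp(-(d ** 2) / (2 * sigma ** 2))
    return w / w.sum()  # normalized attention weights over the sentence

def similar_bag_attention(bag_feature, all_bag_features):
    """Merge a one-sentence bag's feature vector with the features of similar
    bags: cosine similarity -> softmax -> weighted sum."""
    sims = all_bag_features @ bag_feature / (
        np.linalg.norm(all_bag_features, axis=1)
        * np.linalg.norm(bag_feature) + 1e-8)
    alpha = np.exp(sims - sims.max())
    alpha /= alpha.sum()  # softmax over candidate bags
    return alpha @ all_bag_features
```

In this sketch, a five-token sentence with an entity at position 2 yields weights peaked at that position, and a one-sentence bag's feature is enriched by a similarity-weighted average over other bags' features.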


Figs. 1–6 (figures not shown in this preview)

References

  1. Zelenko D, Aone C, Richardella A (2003) Kernel methods for relation extraction. J Mach Learn Res 3(6):1083–1106


  2. Mooney RJ, Bunescu R (2005) Subsequence kernels for relation extraction, pp 171–178

  3. Sadeghi F, Divvala SK, Farhadi A (2015) Viske: Visual knowledge extraction and question answering by visual verification of relation phrases, pp 1456–1464

  4. Ravichandran D, Hovy E (2002) Learning surface text patterns for a question answering system, pp 41–47

  5. Yan Y, Okazaki N, Matsuo Y, Yang Z, Ishizuka M (2009) Unsupervised relation extraction by mining wikipedia texts using information from the web, pp 1021–1029

  6. Zeng D, Liu K, Lai S, Zhou G, Zhao J (2014) Relation classification via convolutional deep neural network, pp 2335–2344

  7. Santos CND, Xiang B, Zhou B (2015) Classifying relations by ranking with convolutional neural networks 1:626–634

  8. Miwa M, Bansal M (2016) End-to-end relation extraction using lstms on sequences and tree structures 1:1105–1116

  9. Mintz M, Bills S, Snow R, Jurafsky D (2009) Distant supervision for relation extraction without labeled data, pp 1003–1011

  10. Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J (2008) Freebase: a collaboratively created graph database for structuring human knowledge, pp 1247–1250

  11. Lin Y, Shen S, Liu Z, Luan H, Sun M (2016) Neural relation extraction with selective attention over instances 1:2124–2133

  12. Riedel S, Yao L, Mccallum A (2010) Modeling relations and their mentions without labeled text, pp 148–163

  13. Hoffmann R, Zhang C, Ling X, Zettlemoyer L, Weld DS (2011) Knowledge-based weak supervision for information extraction of overlapping relations, pp 541–550

  14. Surdeanu M, Tibshirani J, Nallapati R, Manning CD (2012) Multi-instance multi-label learning for relation extraction, pp 455–465

  15. Lecun Y, Bengio Y, Hinton GE (2015) Deep learning. Nature 521(7553):436–444


  16. Socher R, Huval B, Manning CD, Ng AY (2012) Semantic compositionality through recursive matrix-vector spaces, pp 1201–1211

  17. Zeng D, Liu K, Chen Y, Zhao J (2015) Distant supervision for relation extraction via piecewise convolutional neural networks, pp 1753–1762

  18. Ji G, Liu K, He S, Zhao J (2017) Distant supervision for relation extraction with sentence-level attention and entity descriptions, pp 3060–3066

  19. Liu T, Wang K, Chang B, Sui Z (2017) A soft-label method for noise-tolerant distantly supervised relation extraction, pp 1790–1795

  20. Yuan Y, Liu L, Tang S, Zhang Z, Zhuang Y, Pu S, Wu F, Ren X (2019) Cross-relation cross-bag attention for distantly-supervised relation extraction 33(01):419–426

  21. Zhang Y, Zhong V, Chen D, Angeli G, Manning CD (2017) Position-aware attention and supervised data improve slot filling, pp 35–45

  22. Li Y, Long G, Shen T, Zhou T, Yao L, Huo H, Jiang J. Self-attention enhanced selective gate with entity-aware embedding for distantly supervised relation extraction. arXiv preprint

  23. Ye Z, Ling Z (2019) Distant supervision relation extraction with intra-bag and inter-bag attentions, pp 2810–2819

  24. Vilnis L, Mccallum A. Word representations via Gaussian embedding. arXiv preprint

  25. Du J, Han J, Way A, Wan D (2018) Multi-level structured self-attentions for distantly supervised relation extraction, pp 2216–2225

  26. Zhou P, Xu J, Qi Z, Bao H, Chen Z, Xu B (2018) Distant supervision for relation extraction with hierarchical selective attention, vol 108


Acknowledgements

This work is supported by the National Natural Science Foundation of China (62066022).

Author information


Corresponding author

Correspondence to Weijiang Li.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Li, W., Wang, Q., Wu, J. et al. Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction. Appl Intell 52, 4599–4609 (2022). https://doi.org/10.1007/s10489-021-02632-8

