
Distant Supervision Relation Extraction with Improved PCNN and Multi-level Attention

  • Conference paper

Knowledge Science, Engineering and Management (KSEM 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14117)

Abstract

In relation extraction research, large amounts of labeled training data can be obtained quickly through distant supervision. However, distant supervision inevitably introduces wrongly labeled instances, and this noisy data can substantially hurt the performance of a relation extraction model. Previous studies mainly focus on denoising by designing neural networks with intra-bag or inter-bag attention, but they do not sufficiently account for noise interference at different levels. Moreover, the conventional Piecewise Convolutional Neural Network (PCNN) encodes sentences under the assumption that each segmented feature contributes equally to the relation, which is unreasonable. To alleviate these issues, we propose a distant supervision relation extraction model with an Improved PCNN (I-PCNN) and multi-level attention. By incorporating word-level, sentence-level, and bag-level attention, the model effectively reduces the influence of noisy data. It also enhances PCNN with self-attention, which improves the encoding quality of sentences. Experiments on the New York Times (NYT) dataset show that the proposed model clearly outperforms several baseline models in terms of \(Precision\), \(Recall\), \(AUC\), and \(P@N\).
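The piecewise pooling at the heart of PCNN, and the equal-contribution assumption the abstract says I-PCNN relaxes, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sentence features are toy values, and the attention scores that re-weight the three segments are hypothetical stand-ins for what the model would learn.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def piecewise_max_pool(conv_features, e1_pos, e2_pos):
    """conv_features: per-token filter vectors (seq_len x num_filters).
    Split the sequence into three segments at the two entity positions
    and max-pool each segment column-wise, as in PCNN."""
    segments = [conv_features[:e1_pos + 1],
                conv_features[e1_pos + 1:e2_pos + 1],
                conv_features[e2_pos + 1:]]
    return [[max(col) for col in zip(*seg)] for seg in segments]

def weighted_concat(pooled, scores):
    """Vanilla PCNN concatenates the three pooled vectors with equal
    weight; here each segment is re-weighted by an attention score,
    in the spirit of I-PCNN's unequal segment contributions."""
    weights = softmax(scores)
    return [w * v for w, seg in zip(weights, pooled) for v in seg]

# Toy sentence: 6 tokens, 4 convolution filters, entities at positions 1 and 3.
feats = [[float(4 * i + j) for j in range(4)] for i in range(6)]
pooled = piecewise_max_pool(feats, e1_pos=1, e2_pos=3)
vec = weighted_concat(pooled, scores=[0.2, 0.5, 0.3])  # hypothetical scores
print(len(vec))  # 12 = 3 segments x 4 filters
```

In the full model these attention scores would be produced by a learned self-attention layer over the segment representations rather than supplied by hand.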



Author information

Correspondence to Yang Zou.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zou, Y., Wang, Q., Wang, Z., Zhou, J., Zeng, X. (2023). Distant Supervision Relation Extraction with Improved PCNN and Multi-level Attention. In: Jin, Z., Jiang, Y., Buchmann, R.A., Bi, Y., Ghiran, AM., Ma, W. (eds) Knowledge Science, Engineering and Management. KSEM 2023. Lecture Notes in Computer Science, vol 14117. Springer, Cham. https://doi.org/10.1007/978-3-031-40283-8_27

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-40283-8_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-40282-1

  • Online ISBN: 978-3-031-40283-8

  • eBook Packages: Computer Science, Computer Science (R0)
