Abstract
In relation extraction research, distant supervision can quickly produce large amounts of labeled training data. However, it inevitably introduces wrongly labeled instances, and this noisy data can substantially hurt the performance of a relation extraction model. Previous studies mainly focus on de-noising by designing neural networks with intra-bag or inter-bag attention, but they do not sufficiently account for noise interference at different levels. Moreover, the conventional Piecewise Convolutional Neural Network (PCNN) encodes sentences under the assumption that each segmented feature contributes equally to the relation, which is unreasonable. To alleviate these issues, we propose a distant supervision relation extraction model with an Improved PCNN (I-PCNN) and multi-level attention. By incorporating word-level, sentence-level, and bag-level attention, the model effectively reduces the influence of noisy data in a data set. It also enhances PCNN with self-attention, which improves the quality of sentence encoding. Experiments on the New York Times (NYT) data set show that the proposed model clearly outperforms several baseline models in terms of \(Precision\), \(Recall\), \(AUC\), and \(P@N\).
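To make the core idea concrete, the sketch below illustrates one plausible reading of the I-PCNN encoder described above: a 1-D convolution over word embeddings, piecewise max pooling over the three segments delimited by the two entity positions, and an attention weighting over the pooled segment features instead of treating them equally. All function and parameter names here (`i_pcnn_encode`, `conv_w`, `att_w`) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def i_pcnn_encode(sent_emb, e1_pos, e2_pos, conv_w, att_w):
    """Hypothetical I-PCNN encoder sketch.

    sent_emb: (seq_len, emb_dim) word embeddings for one sentence
    e1_pos, e2_pos: token indices of the two entities (e1_pos < e2_pos)
    conv_w: (window, emb_dim, n_filters) convolution kernel
    att_w: (n_filters,) attention query vector over the three segments
    """
    window, emb_dim, n_filters = conv_w.shape
    seq_len = sent_emb.shape[0]
    # 1-D convolution over the sentence (valid padding), tanh activation
    conv = np.stack([
        np.tensordot(sent_emb[i:i + window], conv_w, axes=([0, 1], [0, 1]))
        for i in range(seq_len - window + 1)
    ])                                     # (seq_len - window + 1, n_filters)
    conv = np.tanh(conv)
    # piecewise max pooling: three segments split at the entity positions
    bounds = [0, e1_pos + 1, e2_pos + 1, conv.shape[0]]
    seg = np.stack([conv[bounds[k]:bounds[k + 1]].max(axis=0)
                    for k in range(3)])    # (3, n_filters)
    # attention over segment features, rather than equal contribution
    scores = softmax(seg @ att_w)          # (3,)
    return (scores[:, None] * seg).reshape(-1)  # weighted (3 * n_filters,) vector
```

The returned vector has the same 3 x n_filters layout as a plain PCNN encoding, but each segment's contribution is rescaled by its learned attention score.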
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Zou, Y., Wang, Q., Wang, Z., Zhou, J., Zeng, X. (2023). Distant Supervision Relation Extraction with Improved PCNN and Multi-level Attention. In: Jin, Z., Jiang, Y., Buchmann, R.A., Bi, Y., Ghiran, AM., Ma, W. (eds) Knowledge Science, Engineering and Management. KSEM 2023. Lecture Notes in Computer Science(), vol 14117. Springer, Cham. https://doi.org/10.1007/978-3-031-40283-8_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-40282-1
Online ISBN: 978-3-031-40283-8