Abstract
Distantly supervised relation extraction has proven effective for finding relational facts in text. However, existing approaches treat the instances in the same bag independently and ignore the semantic structural information between them. In this paper, we propose a graph convolutional network (GCN) model with an attention mechanism to improve relation extraction. For each bag, the model first builds a graph from the dependency trees of the instances in the bag, so that correlations between instances are established through their common words. The learned node (word) embeddings, which encode the bag information, are then fed into a sentence encoder (a text CNN) to obtain better sentence representations. In addition, an instance-level attention mechanism is introduced to select valid instances and learn the textual relation embedding. Finally, the learned embedding is used to train our relation classifier. Experiments on two benchmark datasets demonstrate that our model significantly outperforms the compared baselines.
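The pipeline the abstract describes can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: all weight matrices (`E`, `Wg`, `Wc`, `wa`, `Wr`) and the identity adjacency are placeholder assumptions, and a real system would build `adj` from dependency parses and train the parameters end to end.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def bag_forward(bag_tokens, adj, E, Wg, Wc, wa, Wr):
    """Map one bag of instances to relation scores.

    bag_tokens: (inst, seq) word ids of each instance (sentence) in the bag
    adj:        (V, V) row-normalized word graph built from the instances'
                dependency trees; shared words connect the instances
    """
    x = relu(adj @ E @ Wg)                  # one GCN layer: propagate bag info
    sent = x[bag_tokens]                    # (inst, seq, d) GCN-enriched words
    inst, seq, d = sent.shape
    # text CNN: window-3 filters followed by max-over-time pooling
    pos = [relu(sent[:, i:i + 3].reshape(inst, 3 * d) @ Wc)
           for i in range(seq - 2)]
    s = np.stack(pos, axis=1).max(axis=1)   # (inst, filters) sentence vectors
    a = softmax(s @ wa)                     # instance-level attention weights
    bag_repr = a @ s                        # weighted textual relation embedding
    return bag_repr @ Wr                    # relation scores

# Toy run with random parameters (shapes only; no training involved).
rng = np.random.default_rng(0)
V, d, f, r = 30, 16, 8, 5                   # vocab, emb dim, filters, relations
E = rng.normal(size=(V, d))
Wg = rng.normal(size=(d, d))
Wc = rng.normal(size=(3 * d, f))
wa = rng.normal(size=f)
Wr = rng.normal(size=(f, r))
adj = np.eye(V)                             # placeholder; real adj comes from parses
tokens = rng.integers(0, V, size=(4, 9))    # a bag of 4 instances, 9 tokens each
scores = bag_forward(tokens, adj, E, Wg, Wc, wa, Wr)
```

The attention step is what distinguishes valid from noisy instances in a bag: instances whose sentence vectors align with the attention query receive higher weight in the final relation embedding.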
Acknowledgement
This work is supported by the National Natural Science Foundation of China (No. 61772082, 61806020, 61702296, 61972047), the National Key Research and Development Program of China (2017YFB0803304), the Beijing Municipal Natural Science Foundation (4182043), the CCF-Tencent Open Fund, and the Fundamental Research Funds for the Central Universities.
© 2020 Springer Nature Switzerland AG
Cite this paper
Zhang, L., Hu, L., Shi, C. (2020). Incorporating Instance Correlations in Distantly Supervised Relation Extraction. In: Wang, X., Lisi, F., Xiao, G., Botoeva, E. (eds.) Semantic Technology. JIST 2019. Lecture Notes in Computer Science, vol. 12032. Springer, Cham. https://doi.org/10.1007/978-3-030-41407-8_12
Print ISBN: 978-3-030-41406-1
Online ISBN: 978-3-030-41407-8