Abstract
Entity relation extraction is a core task in information extraction; its goal is to extract triples <entity e1, relation r, entity e2> from unstructured text. Current relation extraction models are mainly built on BiLSTM neural networks, and most of them introduce only sentence-level attention mechanisms. Such models have complex structural parameters, which makes them prone to over-fitting, and they do not capture word-level information within the sentence. To address these problems, we propose a model based on a multi-attention mechanism and a BiGRU network. The model uses BiGRU as its main encoding structure; by reducing the number of parameters, training efficiency is effectively improved. At the same time, a multi-attention mechanism is introduced to learn the influence of different features on relation classification at two levels, word level and sentence level, and to improve relation extraction through different weight settings. The model is evaluated on the SemEval-2010 Task 8 dataset, and experiments show that it significantly outperforms the baseline methods.
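To make the word-level attention idea concrete, the following is a minimal illustrative sketch (not the authors' implementation): given per-token representations H produced by a BiGRU encoder, a learned query vector w scores each token, a softmax turns the scores into weights, and the weighted sum pools the tokens into one sentence vector. All names (`word_level_attention`, `w`, the random stand-in for BiGRU outputs) are assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def word_level_attention(H, w):
    """Pool token representations H (T x d) into a single sentence vector.

    alpha = softmax(w^T tanh(H)) gives one weight per word, so tokens that
    matter more for relation classification contribute more to the pooled
    vector. This is a generic formulation, assumed here for illustration.
    """
    M = np.tanh(H)            # (T, d) nonlinear projection of token states
    scores = M @ w            # (T,)   one relevance score per token
    alpha = softmax(scores)   # (T,)   attention distribution over words
    return alpha @ H, alpha   # (d,)   weighted sum = sentence vector

# Toy usage with random numbers standing in for BiGRU outputs.
rng = np.random.default_rng(0)
T, d = 6, 8                         # 6 tokens, hidden size 8 (arbitrary)
H = rng.standard_normal((T, d))     # stand-in for BiGRU hidden states
w = rng.standard_normal(d)          # learnable attention query vector
sentence_vec, alpha = word_level_attention(H, w)
```

A sentence-level attention stage can then weight several such sentence vectors the same way before the final classifier, which is the second dimension of the multi-attention mechanism described above.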
References
Shen, Y., He, X., Gao, J., et al.: A latent semantic model with convolutional-pooling structure for information retrieval. In: Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management, pp. 101–110 (2014)
Liu, C.Y., Sun, W.B., Chao, W.H., Che, W.: Convolution neural network for relation extraction. In: Motoda, H., Wu, Z., Cao, L., Zaiane, O., Yao, M., Wang, W. (eds.) ADMA 2013. LNCS (LNAI), vol. 8347, pp. 231–242. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-53917-6_21
Zeng, D., Liu, K., Lai, S., Zhou, G., Zhao, J.: Relation classification via convolutional deep neural network. In: Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers, pp. 2335–2344 (2014)
dos Santos, C., Xiang, B., Zhou, B.: Classifying relations by ranking with convolutional neural networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing: Long Papers, vol. 1 (2015)
Miwa, M., Bansal, M.: End-to-end relation extraction using LSTMs on sequences and tree structures. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics: Long Papers, vol. 1 (2016)
Zhang, D., Wang, D.J.: Relation classification via recurrent neural network. arXiv preprint arXiv:1508.01006 (2015)
Zhang, S., Zheng, D., Hu, X., Yang, M.: Bidirectional long short-term memory networks for relation classification. In: Proceedings of the 29th Pacific Asia Conference on Language, Information and Computation, pp. 73–78 (2015)
Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473 (2014)
Katiyar, A., Cardie, C.: Going out on a limb: joint extraction of entity mentions and relations without dependency trees. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics: Long Papers, vol. 1, pp. 917–928 (2017)
Zhou, P., et al.: Attention-based bidirectional long short-term memory networks for relation classification. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics: Short Papers, vol. 2, pp. 207–212 (2016)
Vaswani, A., Shazeer, N., Parmar, N., et al.: Attention is all you need. arXiv preprint arXiv:1706.03762 (2017)
Cho, K., van Merriënboer, B., Gulcehre, C., et al.: Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078 (2014)
Lin, Y., Shen, S., Liu, Z., et al.: Neural relation extraction with selective attention over instances. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics: Long Papers, vol. 1 (2016)
Hendrickx, I., Kim, S.N., Kozareva, Z., et al.: SemEval-2010 task 8: multi-way classification of semantic relations between pairs of nominals. In: Proceedings of the Workshop on Semantic Evaluations: Recent Achievements and Future Directions, pp. 94–99 (2009)
Chen, Z., Pan, L., Liu, S.: Hybrid BiLSTM-siamese network for relation extraction. In: Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems, pp. 1907–1909 (2019)
Acknowledgments
This research is supported by the National Key Research and Development Program of China under grant number 2017YFC1405404, and by the Green Industry Technology Leading Project (product development category) of Hubei University of Technology under grant number CPYF2017008.
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Wang, L., Xiong, C., Xu, W., Lin, S. (2021). Entity Relation Extraction Based on Multi-attention Mechanism and BiGRU Network. In: Barolli, L., Yim, K., Enokido, T. (eds) Complex, Intelligent and Software Intensive Systems. CISIS 2021. Lecture Notes in Networks and Systems, vol 278. Springer, Cham. https://doi.org/10.1007/978-3-030-79725-6_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-79724-9
Online ISBN: 978-3-030-79725-6
eBook Packages: Intelligent Technologies and Robotics (R0)