Abstract
Relation extraction (RE) is an important part of knowledge graph construction, and span-based joint entity-relation extraction models are an emerging approach to it. In these models, the span representation vectors are usually generated by relatively simple methods, so their semantic representation ability is limited. This paper studies the impact of four different span representation methods on the performance of a joint entity-relation extraction model, and enriches the features of the span representation vectors by combining multiple span semantic representation methods. On the CoNLL04 dataset, the combined span representation effectively improves performance over the baseline model: named entity recognition reaches an F1 score of 89.37% and relation extraction reaches an F1 score of 72.64%, gains of 0.43% and 1.17% over the baseline, respectively.
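The abstract names four span representation methods without specifying them. A common set in span-based models such as SpERT (Eberts and Ulges, 2020) is max pooling, mean pooling, boundary-token concatenation, and attention pooling over the span's tokens, optionally enriched with a span-width embedding. The following is a minimal PyTorch sketch of these four methods and their combination by concatenation; the method choices, the fusion-by-concatenation design, and all names in the code are illustrative assumptions based on the abstract, not the authors' released implementation.

```python
# Hypothetical sketch of four span representation methods and their combination.
import torch
import torch.nn as nn


class SpanRepresentation(nn.Module):
    def __init__(self, hidden_dim: int, width_embed_dim: int = 25, max_width: int = 10):
        super().__init__()
        # Learned embedding of the span width (a SpERT-style feature).
        self.width_embedding = nn.Embedding(max_width + 1, width_embed_dim)
        # Scoring layer for attention pooling over the span's tokens.
        self.attn_score = nn.Linear(hidden_dim, 1)

    def forward(self, token_embeds: torch.Tensor, start: int, end: int) -> torch.Tensor:
        """token_embeds: (seq_len, hidden_dim); the span covers tokens [start, end]."""
        span = token_embeds[start : end + 1]  # (width, hidden_dim)

        # 1) Max pooling over the span's token embeddings.
        max_pool = span.max(dim=0).values

        # 2) Mean pooling over the span's token embeddings.
        mean_pool = span.mean(dim=0)

        # 3) Boundary representation: first and last token concatenated.
        boundary = torch.cat([span[0], span[-1]], dim=-1)

        # 4) Attention pooling: a learned weighted sum of the span's tokens.
        weights = torch.softmax(self.attn_score(span).squeeze(-1), dim=0)
        attn_pool = weights @ span

        # Span width feature, clamped to the embedding table size.
        width = torch.tensor(min(end - start + 1, self.width_embedding.num_embeddings - 1))
        width_embed = self.width_embedding(width)

        # Combined representation: all four views plus the width feature.
        return torch.cat([max_pool, mean_pool, boundary, attn_pool, width_embed], dim=-1)


if __name__ == "__main__":
    torch.manual_seed(0)
    encoder_out = torch.randn(12, 768)           # e.g. BERT hidden states for 12 tokens
    span_rep = SpanRepresentation(hidden_dim=768)
    vec = span_rep(encoder_out, start=3, end=5)  # span covering tokens 3..5
    print(vec.shape)                             # torch.Size([3865]) = 768 * 5 + 25
```

The combined vector would then feed the entity classifier (and, paired with another span, the relation classifier); concatenation keeps each view intact at the cost of a wider downstream layer.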
© 2021 Springer Nature Switzerland AG

Cite this paper
Tang, Y., Yu, J., Li, S., Ji, B., Tan, Y., Wu, Q. (2021). Span Representation Generation Method in Entity-Relation Joint Extraction. In: Huang, D.S., Jo, K.H., Li, J., Gribova, V., Hussain, A. (eds.) Intelligent Computing Theories and Application. ICIC 2021. Lecture Notes in Computer Science, vol. 12837. Springer, Cham. https://doi.org/10.1007/978-3-030-84529-2_39
DOI: https://doi.org/10.1007/978-3-030-84529-2_39
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-84528-5
Online ISBN: 978-3-030-84529-2