Abstract
Relation classification is a fundamental task in natural language processing, supplying high-quality corpora for fields such as machine translation, structured data generation, knowledge graphs, and semantic question answering. Existing relation classification models fall into three families: models based on traditional machine learning, models based on deep learning, and models based on attention mechanisms. These models share a common shortcoming: they classify relations using only a single type of feature and do not effectively combine entity features with context features. To address this problem, this study proposes a relation classification network that integrates multi-scale semantic features. By combining entity features with contextual semantic features, the model learns the relational features between entities more effectively. To verify the validity of the model and the effectiveness of each module, experiments were conducted on the SemEval-2010 Task 8 and KBP37 datasets. The results demonstrate that the model outperforms most existing models.
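The fusion idea described above (combining entity-level features with a sentence-level context feature before classification) can be sketched roughly as follows. This is an illustrative outline only, not the authors' exact architecture: the mean-pooling, dimensions, and linear classifier are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relation_logits(token_vecs, e1_span, e2_span, W, b):
    """Toy fusion of entity and context features for relation classification.

    token_vecs : (seq_len, d) contextual token embeddings
    e1_span, e2_span : (start, end) index ranges of the two entity mentions
    W, b : weights of a linear classifier over the fused feature
    """
    # Entity features: mean-pool the token vectors inside each entity span
    e1 = token_vecs[e1_span[0]:e1_span[1]].mean(axis=0)
    e2 = token_vecs[e2_span[0]:e2_span[1]].mean(axis=0)
    # Context feature: mean-pool over the whole sentence
    ctx = token_vecs.mean(axis=0)
    # Fuse entity and context features by concatenation
    fused = np.concatenate([e1, e2, ctx])   # shape (3 * d,)
    return fused @ W + b                    # logits over relation types

d, n_rel, seq_len = 8, 19, 12  # SemEval-2010 Task 8 defines 19 relation labels
tokens = rng.standard_normal((seq_len, d))
W = rng.standard_normal((3 * d, n_rel))
b = np.zeros(n_rel)
logits = relation_logits(tokens, (2, 4), (7, 9), W, b)
print(logits.shape)  # (19,)
```

A model following this pattern would replace the random token vectors with encoder outputs (e.g., from a pre-trained language model) and train `W`, `b` jointly with the encoder.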
Acknowledgements
This work was supported by National Key Research and Development Program of China (2022YFB4004401), and the Taishan Scholars Program (NO. tsqn202103097).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Li, G., Tian, J., Zhou, M., Li, M., Han, D. (2023). A Relational Classification Network Integrating Multi-scale Semantic Features. In: Liu, F., Duan, N., Xu, Q., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2023. Lecture Notes in Computer Science(), vol 14303. Springer, Cham. https://doi.org/10.1007/978-3-031-44696-2_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-44695-5
Online ISBN: 978-3-031-44696-2