Abstract
Domain-specific relation extraction plays an important role in constructing domain knowledge graphs and in downstream analysis. Military intelligence relation extraction faces challenges such as overlapping relations and exposure bias. This paper therefore presents a single-step joint relation extraction model for military text analysis that combines a domain N-gram adapter with axial attention. To account for domain-specific language structures and patterns, the domain N-gram adapter is incorporated into the pre-trained language model to improve the model's encoding. Furthermore, an axial attention mechanism is applied to capture the dependencies between token pairs and their contexts, enhancing the model's representation ability. Entities and relations are then jointly extracted with a relation-specific decoding method. Experiments demonstrate the effectiveness of the proposed model, which achieves an F1-score of 0.6690 on a military relation extraction dataset and 0.6051 on CMeIE, outperforming existing joint relation extraction models.
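As a minimal sketch of the axial attention idea mentioned in the abstract (not the authors' implementation; module and parameter names are illustrative), the snippet below applies self-attention first along the rows and then along the columns of an L x L token-pair table, so each pair (i, j) attends only to pairs sharing its row or column instead of the full grid.

```python
# Minimal axial-attention sketch over a token-pair table (PyTorch).
import torch
import torch.nn as nn


class AxialAttention(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, pair: torch.Tensor) -> torch.Tensor:
        # pair: (batch, L, L, dim) -- one vector per token pair (i, j)
        b, L, _, d = pair.shape

        # Row attention: each position attends to the other pairs in its row.
        rows = pair.reshape(b * L, L, d)
        rows, _ = self.row_attn(rows, rows, rows)
        pair = rows.reshape(b, L, L, d)

        # Column attention: transpose so columns become the sequence axis.
        cols = pair.transpose(1, 2).reshape(b * L, L, d)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.reshape(b, L, L, d).transpose(1, 2)


if __name__ == "__main__":
    # Example: pair table for a 6-token sentence with 64-dim pair embeddings.
    table = torch.randn(2, 6, 6, 64)
    out = AxialAttention(dim=64)(table)
    print(out.shape)  # torch.Size([2, 6, 6, 64])
```

Restricting attention to rows and columns reduces the cost per layer from O(L^4) over all pair-to-pair interactions to O(L^3), which is what makes attention over a full token-pair table practical.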
Cite this paper
Yang, Z., Li, Z., Xu, Z., Gan, Z., Cao, W. (2023). A Joint Relation Extraction Model Based on Domain N-Gram Adapter and Axial Attention for Military Domain. In: Yuan, L., Yang, S., Li, R., Kanoulas, E., Zhao, X. (eds) Web Information Systems and Applications. WISA 2023. Lecture Notes in Computer Science, vol 14094. Springer, Singapore. https://doi.org/10.1007/978-981-99-6222-8_20