Abstract
Medical named entity recognition can help doctors quickly identify key content and improve clinical work efficiency. Chinese named entity recognition methods based on pre-trained language models have achieved remarkable performance. However, most of these models face two problems in medical named entity recognition: they are designed for flat named entity recognition tasks rather than nested entities, and medical entities are hard to recognize due to a lack of medical domain knowledge. To tackle these problems, we propose a Chinese medical nested named entity recognition model based on feature fusion and a bidirectional lattice embedding graph. Introducing a medical lexicon addresses the poor recognition of medical entities caused by missing domain knowledge, and introducing pinyin information disambiguates Chinese polyphonic characters that share the same written form but carry different meanings. The model also considers the similarity between different entity types to improve its effectiveness. Results on the Chinese medical nested named entity dataset CBLUE-CMeEE demonstrate the superior performance and effectiveness of the model.
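To make the described pipeline concrete, below is a minimal PyTorch sketch of the feature-fusion idea the abstract names: each character embedding is fused with a pinyin embedding (to disambiguate polyphonic characters), matched medical-lexicon words become extra lattice nodes, and a bidirectional graph update propagates information between characters and words. This is not the authors' released code; all module names, dimensions, and the mean-aggregation update are illustrative assumptions standing in for the paper's actual fusion layers and graph attention.

```python
# Minimal sketch of character/pinyin/lexicon feature fusion over a
# bidirectional lattice graph. Illustrative only, not the paper's model.
import torch
import torch.nn as nn

class FusedLatticeEmbedding(nn.Module):
    def __init__(self, n_chars, n_pinyin, n_words, dim=128):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, dim)
        self.pinyin_emb = nn.Embedding(n_pinyin, dim)
        self.word_emb = nn.Embedding(n_words, dim)  # medical-lexicon entries
        self.fuse = nn.Linear(2 * dim, dim)         # char + pinyin fusion

    def forward(self, char_ids, pinyin_ids, word_ids, adj):
        # char_ids, pinyin_ids: (seq_len,); word_ids: (n_matched_words,)
        # adj: (n_nodes, n_nodes) symmetric adjacency linking each matched
        # lexicon word to the characters it covers (the bidirectional lattice)
        h_char = self.fuse(torch.cat(
            [self.char_emb(char_ids), self.pinyin_emb(pinyin_ids)], dim=-1))
        h_word = self.word_emb(word_ids)
        nodes = torch.cat([h_char, h_word], dim=0)  # lattice graph nodes
        # one step of mean-aggregation message passing; a graph-attention
        # update (as in the paper's lattice embedding graph) would replace this
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        return nodes + (adj @ nodes) / deg

# toy usage: a 4-character sentence with 2 matched lexicon words
model = FusedLatticeEmbedding(n_chars=100, n_pinyin=50, n_words=30)
char_ids = torch.tensor([3, 7, 12, 5])
pinyin_ids = torch.tensor([1, 4, 4, 2])   # polyphonic chars share a form, not a pinyin
word_ids = torch.tensor([8, 21])          # lexicon matches covering chars 0-1 and 2-3
adj = torch.zeros(6, 6)
adj[0, 4] = adj[4, 0] = adj[1, 4] = adj[4, 1] = 1.0
adj[2, 5] = adj[5, 2] = adj[3, 5] = adj[5, 3] = 1.0
out = model(char_ids, pinyin_ids, word_ids, adj)  # (6, 128) fused node states
```

The key design point the sketch illustrates is that lexicon words are first-class graph nodes rather than features appended to characters, so information can flow in both directions between a character and every medical term that covers it.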
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Cong, Q., Feng, Z., Rao, G., Zhang, L. (2023). Chinese Medical Nested Named Entity Recognition Model Based on Feature Fusion and Bidirectional Lattice Embedding Graph. In: Wang, X., et al. Database Systems for Advanced Applications. DASFAA 2023. Lecture Notes in Computer Science, vol 13946. Springer, Cham. https://doi.org/10.1007/978-3-031-30678-5_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-30677-8
Online ISBN: 978-3-031-30678-5
eBook Packages: Computer Science (R0)