Abstract
The insufficient number of labelled samples is currently the biggest constraint on named entity recognition (NER), as only a small number of Registers (i.e., domains of language, explained in Part I) currently have a corpus with sufficient labels. Because the linguistic features of different Registers vary greatly, a well-labelled corpus in one Register cannot simply be applied to NER in another. In addition, most current NER models are designed for large samples with abundant labels and perform poorly on small samples with few labels. To address these problems, this paper proposes T_NER, a model based on the ideas of transfer learning and multi-task learning: it learns the common features of language through multi-task training, then passes the parameters of the neurons that capture these common features, learned from multiple well-labelled source domains, to the corresponding neurons in the target domain, achieving transfer learning through parameter sharing. In baseline experiments, T_NER outperformed the original models such as BiLSTM and BiGRU on a small-sample NER task; in the formal experiments, the more Registers included in the source domains, the better T_NER recognised entities in the target domain. The experiments demonstrate that T_NER can achieve NER on small samples and across Registers.
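The parameter-sharing transfer described in the abstract can be sketched as follows. This is a minimal, framework-free illustration, not the paper's implementation: the model structure, function names, and toy "training" update are all hypothetical stand-ins for a shared encoder trained jointly on several source Registers and then copied into a target-Register model.

```python
# Sketch of parameter-sharing transfer for multi-task NER (hypothetical, not
# the paper's code). A shared "encoder" is updated on every source register,
# then only its parameters are copied into a fresh target-register model; the
# task-specific head is re-initialised and would be trained on the target.

import copy

def make_model():
    # Toy model: shared encoder parameters + task-specific head parameters.
    return {"encoder": {"w": 0.0}, "head": {"w": 0.0}}

def multitask_train(model, source_registers):
    # Stand-in for joint multi-task training: the shared encoder accumulates
    # "common language features" from every source register's data.
    for data in source_registers:
        model["encoder"]["w"] += sum(data) / len(data)
    return model

def transfer(source_model):
    # Parameter sharing: copy only the shared encoder to the target model;
    # the head stays freshly initialised for the target register.
    target = make_model()
    target["encoder"] = copy.deepcopy(source_model["encoder"])
    return target

source = multitask_train(make_model(), [[1.0, 2.0], [3.0, 5.0]])
target = transfer(source)
assert target["encoder"] == source["encoder"]  # shared layer transferred
assert target["head"]["w"] == 0.0              # head trained anew on target
```

The key design point the sketch captures is that only the layers assumed to encode Register-independent features are transferred, while the Register-specific output layer is learned from the (small) target-domain sample.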
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Ma, H., Ding, Z., Zhou, D., Wang, J., Niu, S. (2022). Research on NER Based on Register Migration and Multi-task Learning. In: Wang, L., Segal, M., Chen, J., Qiu, T. (eds) Wireless Algorithms, Systems, and Applications. WASA 2022. Lecture Notes in Computer Science, vol 13473. Springer, Cham. https://doi.org/10.1007/978-3-031-19211-1_55
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-19210-4
Online ISBN: 978-3-031-19211-1