
Research on NER Based on Register Migration and Multi-task Learning

  • Conference paper
  • In: Wireless Algorithms, Systems, and Applications (WASA 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13473)

Abstract

The shortage of labeled data is currently the biggest constraint on named entity recognition (NER): only a small number of registers (a register is a domain of language use; see Part I) have a corpus with sufficient labels. Because the linguistic features of different registers vary greatly, a well-labeled corpus from one register cannot be applied directly to NER in another register. In addition, most current NER models are designed for large samples with sufficient labels, and they perform poorly on small samples with few labels. To address these problems, this paper proposes T_NER, a model based on transfer (migration) learning and multi-task learning. It learns language-general features through multi-task training, then passes the parameters of the neurons encoding those general features, learned from multiple well-labeled source domains, to the corresponding neurons in the target domain, achieving transfer learning through parameter sharing. In baseline experiments, T_NER outperformed the original models such as BiLSTM and BiGRU on a small-sample NER task; in the formal experiments, the more registers included in the source domains, the better T_NER recognized entities in the target domain. The experiments demonstrate that T_NER can achieve NER on small samples and across registers.
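The multi-task plus parameter-sharing scheme in the abstract can be sketched in a few lines. This is a toy illustration only, not the paper's implementation: the names (`shared_encoder`, `heads`) and the simplified gradient-averaging update are assumptions; the real model uses BiLSTM/BiGRU layers.

```python
# Toy sketch of multi-task training with a shared encoder, followed by
# parameter-sharing transfer to a low-resource target register.
# All names and the update rule are illustrative assumptions.

def build_multi_task_model(hidden=4, registers=("news", "fiction")):
    """One encoder shared by all source registers, one head per register."""
    return {
        "shared_encoder": [0.0] * hidden,
        "heads": {r: [0.0] * hidden for r in registers},
    }

def multi_task_step(model, grads_by_register, lr=0.1):
    """Multi-task update: the shared encoder receives the gradient averaged
    over every source register, pushing it toward language-general features;
    each head is updated only by its own register's gradient."""
    n = len(grads_by_register)
    for i in range(len(model["shared_encoder"])):
        avg = sum(g["encoder"][i] for g in grads_by_register.values()) / n
        model["shared_encoder"][i] -= lr * avg
    for reg, g in grads_by_register.items():
        head = model["heads"][reg]
        for i in range(len(head)):
            head[i] -= lr * g["head"][i]

def transfer_to_target(source_model, hidden=4):
    """Parameter-sharing transfer: copy the shared-encoder weights into a
    fresh target-domain model; only its head remains to be fine-tuned on
    the target register's few labels."""
    return {
        "shared_encoder": list(source_model["shared_encoder"]),
        "head": [0.0] * hidden,
    }
```

In this sketch, only the encoder weights cross the domain boundary; the target keeps its own randomly initialized head, mirroring the idea that register-general features transfer while register-specific decisions are relearned.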



Author information


Corresponding author

Correspondence to Zhaoyun Ding.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ma, H., Ding, Z., Zhou, D., Wang, J., Niu, S. (2022). Research on NER Based on Register Migration and Multi-task Learning. In: Wang, L., Segal, M., Chen, J., Qiu, T. (eds) Wireless Algorithms, Systems, and Applications. WASA 2022. Lecture Notes in Computer Science, vol 13473. Springer, Cham. https://doi.org/10.1007/978-3-031-19211-1_55



  • Print ISBN: 978-3-031-19210-4

  • Online ISBN: 978-3-031-19211-1

  • eBook Packages: Computer Science, Computer Science (R0)
