
Chinese Named Entity Recognition Using the Improved Transformer Encoder and the Lexicon Adapter

  • Conference paper

Artificial Neural Networks and Machine Learning – ICANN 2022 (ICANN 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13530)

Abstract

Named entity recognition (NER) is a fundamental task in natural language processing (NLP) and has attracted wide attention. Many methods that fuse the potential word representations in a Chinese sentence into the corresponding Chinese character representations have been applied to the Long Short-Term Memory (LSTM) model, with good results on the Chinese NER task. However, the recurrent structure of LSTM cannot take full advantage of the parallelism of GPUs. We therefore design a character-word attention adapter in the embedding layer to accelerate information fusion. Recently, the Transformer encoder has become popular in NLP for its parallel computing performance and its ability to model long-distance context. Nevertheless, the native Transformer encoder performs poorly on the NER task. We analyze several shortcomings of the Transformer encoder and, on this basis, refine its position embedding and self-attention calculation. Finally, we propose a new Chinese NER architecture that combines the improved Transformer encoder with the lexicon adapter. On four Chinese NER datasets, our model achieves better performance than competing models.
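The abstract does not spell out the refinements to the position embedding and self-attention, but prior work on adapting Transformer encoders to NER (e.g. TENER) replaces absolute position embeddings with relative-distance ones and drops the 1/sqrt(d) attention scaling so attention stays sharp. The sketch below illustrates that general style of computation; it is a hypothetical reconstruction, not the authors' code, and the function names and the learned bias vectors `u` and `v` are illustrative assumptions.

```python
import numpy as np

def relative_position_embedding(seq_len, d_model):
    """Sinusoidal embedding over relative distances -(L-1)..(L-1),
    in the spirit of TENER's relative position encoding."""
    positions = np.arange(-(seq_len - 1), seq_len)              # 2L-1 distances
    dims = np.arange(0, d_model, 2)
    angles = positions[:, None] / np.power(10000.0, dims / d_model)[None, :]
    emb = np.zeros((len(positions), d_model))
    emb[:, 0::2] = np.sin(angles)                               # even dims: sin
    emb[:, 1::2] = np.cos(angles)                               # odd dims: cos
    return emb

def unscaled_relative_attention(Q, K, V, rel_emb, u, v):
    """Self-attention with relative positions and no sqrt(d) scaling
    (un-scaled attention has been reported to help NER)."""
    L, d = Q.shape
    # content-content scores, with a learned content bias u
    scores_cc = (Q + u) @ K.T
    # R[i, j] = embedding of the relative distance (i - j)
    idx = (np.arange(L)[:, None] - np.arange(L)[None, :]) + (L - 1)
    R = rel_emb[idx]                                            # (L, L, d)
    # content-position scores, with a learned position bias v
    scores_cp = np.einsum('id,ijd->ij', Q + v, R)
    scores = scores_cc + scores_cp                              # note: no / sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

In a full model, one attention layer per head would run over the character embeddings after the lexicon adapter has fused in the matched word information.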



Author information

Correspondence to Yuhua Huang.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Sun, M., Wang, L., Sheng, T., He, Z., Huang, Y. (2022). Chinese Named Entity Recognition Using the Improved Transformer Encoder and the Lexicon Adapter. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds) Artificial Neural Networks and Machine Learning – ICANN 2022. ICANN 2022. Lecture Notes in Computer Science, vol 13530. Springer, Cham. https://doi.org/10.1007/978-3-031-15931-2_17

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-15931-2_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-15930-5

  • Online ISBN: 978-3-031-15931-2

  • eBook Packages: Computer Science, Computer Science (R0)
