
DocBAN: An Efficient Biaffine Attention Network for Document-Level Named Entity Recognition

  • Conference paper
Advanced Intelligent Computing Technology and Applications (ICIC 2024)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14877)


Abstract

Named entity recognition (NER) aims to identify predefined types of entities in a given text sequence. When the sequence consists of multiple sentences, the problem is referred to as document-level NER. Recently, most span-based methods exploit biaffine attention to obtain an n × n score matrix, where n is the sequence length and each entry denotes a span representation. However, when dealing with long sequences, as in document-level NER, these models are inefficient because they enumerate all spans via a Biaffine Attention Network (BAN), which scales quadratically with the sequence length. To address this limitation, we propose an efficient BAN for document-level NER, called DocBAN, which scales linearly with the sequence length and can be regarded as a parallel alternative to standard biaffine attention. Specifically, we reduce its time and space complexity by introducing a sliding-window mechanism and use a U-Net to capture global features across multiple windows. Extensive experiments demonstrate that the proposed DocBAN, serving as an alternative to BAN, significantly improves the efficiency of existing span-based methods while maintaining competitive performance.
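To make the complexity contrast concrete, the sketch below compares, under stated assumptions, a standard biaffine span scorer that materialises the full n × n × C score tensor with a sliding-window variant that only scores spans of width at most w, so the number of scored spans grows as n · w rather than n². This is a minimal PyTorch-style illustration, not the authors' implementation: the class and parameter names (StandardBiaffine, WindowedBiaffine, window) are hypothetical, the affine bias terms of the usual biaffine form are omitted, and the U-Net that aggregates cross-window features in DocBAN is not shown.

```python
# Hypothetical sketch (not the authors' code): standard vs. windowed biaffine span scoring.
import torch
import torch.nn as nn


class StandardBiaffine(nn.Module):
    """Scores every (start, end) pair, materialising an n x n x C tensor."""

    def __init__(self, hidden: int, num_labels: int):
        super().__init__()
        self.start_proj = nn.Linear(hidden, hidden)
        self.end_proj = nn.Linear(hidden, hidden)
        # Bilinear weight of shape (hidden, num_labels, hidden); bias terms omitted.
        self.U = nn.Parameter(torch.randn(hidden, num_labels, hidden) * 0.02)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, n, hidden) contextual token representations
        hs = self.start_proj(h)  # (b, n, d)
        he = self.end_proj(h)    # (b, n, d)
        # scores[b, i, j, c] = hs[b, i] @ U[:, c, :] @ he[b, j]  -> O(n^2) entries
        return torch.einsum("bid,dce,bje->bijc", hs, self.U, he)


class WindowedBiaffine(nn.Module):
    """Scores only spans of width < w, giving O(n * w) entries (linear in n for fixed w)."""

    def __init__(self, hidden: int, num_labels: int, window: int):
        super().__init__()
        self.window = window
        self.inner = StandardBiaffine(hidden, num_labels)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        b, n, _ = h.shape
        w = self.window
        # For each start i, gather the w candidate end positions i, i+1, ..., i+w-1
        # (ends running past the sequence are clamped to the last token in this sketch).
        idx = torch.arange(n, device=h.device).unsqueeze(1) + torch.arange(w, device=h.device)
        idx = idx.clamp(max=n - 1)                    # (n, w)
        hs = self.inner.start_proj(h)                 # (b, n, d)
        he_win = self.inner.end_proj(h)[:, idx]       # (b, n, w, d)
        # scores[b, i, k, c] = score of the span starting at i with width k
        return torch.einsum("bid,dce,bike->bikc", hs, self.inner.U, he_win)


if __name__ == "__main__":
    h = torch.randn(2, 128, 64)                        # batch of 2, sequence length 128
    full = StandardBiaffine(64, num_labels=5)(h)       # (2, 128, 128, 5)
    windowed = WindowedBiaffine(64, 5, window=16)(h)   # (2, 128, 16, 5)
    print(full.shape, windowed.shape)
```

For a sequence of length 128 with window 16 and 5 labels, the full scorer produces a 128 × 128 × 5 tensor while the windowed scorer produces 128 × 16 × 5, which is the source of the linear scaling in n; how DocBAN then recovers information about spans and interactions beyond a single window (via the U-Net over multiple windows) is described in the paper itself.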



Acknowledgement

This research was supported in part by the Guangxi Science and Technology Major Project (No. AA22068070), the National Natural Science Foundation of China (Nos. 62166004 and U21A20474), the Key Lab of Education Blockchain and Intelligent Technology, the Center for Applied Mathematics of Guangxi, the Guangxi “Bagui Scholar” Teams for Innovation and Research Project, the Guangxi Talent Highland Project of Big Data Intelligence and Application, and the Guangxi Collaborative Center of Multisource Information Integration and Intelligent Processing.

Author information


Corresponding author

Correspondence to Peng Liu.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Wu, H., Li, X., Yang, D., Zhou, A., Wang, P., Liu, P. (2024). DocBAN: An Efficient Biaffine Attention Network for Document-Level Named Entity Recognition. In: Huang, D.S., Si, Z., Zhang, Q. (eds.) Advanced Intelligent Computing Technology and Applications. ICIC 2024. Lecture Notes in Computer Science (LNAI), vol. 14877. Springer, Singapore. https://doi.org/10.1007/978-981-97-5669-8_6


  • DOI: https://doi.org/10.1007/978-981-97-5669-8_6


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-5668-1

  • Online ISBN: 978-981-97-5669-8

  • eBook Packages: Computer Science, Computer Science (R0)
