A Unified Platform for Information Extraction with Two-Stage Process

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13029)

Abstract

The multi-format Information Extraction (IE) task in the Language and Intelligence Challenge 2021 (LIC2021) consists of three subtasks: Relation Extraction (RE), Sentence-level Event Extraction (SentEE) and Document-level Event Extraction (DocEE). Deep learning methods have made great progress on each subtask in recent years, but most of them cannot solve all three within a unified platform. In this paper, we develop a unified neural model with a two-stage process: stage one adopts an Enhanced NER module to obtain the ELEMENTs and their corresponding LABELs, and stage two applies customized manoeuvres to address the challenges of each subtask. Our submission achieves competitive performance, ranking 3rd on the final leaderboard.
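To make the two-stage organisation described above concrete, the following is a minimal sketch of such a pipeline, written under stated assumptions rather than taken from the paper: the names LabeledElement, enhanced_ner, and the per-subtask handler functions are hypothetical, and the stage-one tagger is stubbed with a canned prediction so the skeleton runs end to end.

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    # Hypothetical container for stage-one output: an extracted span
    # ("ELEMENT") together with its predicted "LABEL".
    @dataclass
    class LabeledElement:
        text: str    # the extracted span
        label: str   # its predicted label (e.g. an argument role)
        start: int   # character offsets of the span in the input text
        end: int

    def enhanced_ner(text: str) -> List[LabeledElement]:
        """Stand-in for the stage-one Enhanced NER module.

        The paper uses a neural tagger here; this stub returns a canned
        prediction so the pipeline skeleton is runnable.
        """
        first_token = text.split()[0]
        return [LabeledElement(text=first_token, label="subject",
                               start=0, end=len(first_token))]

    # Stage two: one "customized manoeuvre" (post-processing routine) per subtask.
    def assemble_relations(elements: List[LabeledElement]) -> List[dict]:
        return [{"element": e.text, "role": e.label} for e in elements]

    def assemble_sentence_events(elements: List[LabeledElement]) -> List[dict]:
        return [{"argument": e.text, "role": e.label} for e in elements]

    def assemble_document_events(elements: List[LabeledElement]) -> List[dict]:
        return [{"argument": e.text, "role": e.label, "span": (e.start, e.end)}
                for e in elements]

    SUBTASK_HANDLERS: Dict[str, Callable[[List[LabeledElement]], List[dict]]] = {
        "RE": assemble_relations,
        "SentEE": assemble_sentence_events,
        "DocEE": assemble_document_events,
    }

    def extract(text: str, subtask: str) -> List[dict]:
        """Shared stage one (tagging), then subtask-specific stage two."""
        elements = enhanced_ner(text)
        return SUBTASK_HANDLERS[subtask](elements)

    if __name__ == "__main__":
        print(extract("Acme Corp acquired Beta Ltd in 2020.", "RE"))

The dispatch table mirrors the claim in the abstract: the three subtasks share stage one and differ only in their stage-two post-processing.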


Notes

  1. For details of the datasets, please refer to https://aistudio.baidu.com.


Author information

Corresponding author

Correspondence to Peng Liu.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhao, C., Guo, D., Dai, X., Gu, C., Fa, L., Liu, P. (2021). A Unified Platform for Information Extraction with Two-Stage Process. In: Wang, L., Feng, Y., Hong, Y., He, R. (eds) Natural Language Processing and Chinese Computing. NLPCC 2021. Lecture Notes in Computer Science, vol. 13029. Springer, Cham. https://doi.org/10.1007/978-3-030-88483-3_41

  • DOI: https://doi.org/10.1007/978-3-030-88483-3_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-88482-6

  • Online ISBN: 978-3-030-88483-3

  • eBook Packages: Computer Science, Computer Science (R0)
