
Efficient Chinese Relation Extraction with Multi-entity Dependency Tree Pruning and Path-Fusion

  • Conference paper
  • Neural Information Processing (ICONIP 2023)

Abstract

Relation Extraction (RE) is a crucial task in natural language processing that aims to predict the relationship between two given entities. In recent years, a large majority of approaches have utilized syntactic information, particularly dependency trees, to enhance relation extraction with stronger semantic guidance. Compared with texts in other languages, Chinese texts are semantically more complex and often contain multiple entity pairs. However, many studies focus only on removing from the dependency tree information extraneous to a single entity pair. We hypothesize that preserving the semantic and structural interactions between multiple entity pairs in the tree is more conducive to identifying the relationship of the current entity pair. Therefore, we propose a new pruning strategy called Multi-entity dependency Tree Pruning and path-Fusion (MTPF), which preserves, for each entity pair, the ancestor nodes up to their lowest common ancestor, as well as the shortest path from that ancestor to each entity. We then introduce A-GCN as the encoder for the pruned syntax tree and a multi-class sequence classification scheme as the decoder. Experimental results on two Chinese benchmark datasets, a financial dataset constructed by ourselves and DUIE1.0, demonstrate the effectiveness of our pruning strategy for Chinese relation extraction (CRE): our approach outperforms strong dependency-tree baselines and achieves state-of-the-art results on both datasets.
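The pruning rule described above can be made concrete with a small sketch. The Python code below is an illustrative reading of the abstract only, not the authors' implementation: it assumes the dependency tree is given as a list of 1-based head indices (0 denoting the artificial root), takes the target entity pair plus all entity mentions in the sentence, and keeps (i) the ancestor chain of each target entity up to the pair's lowest common ancestor and (ii) the shortest tree path from that ancestor to every entity. Function names such as `mtpf_prune` are hypothetical.

```python
# Illustrative sketch of an MTPF-style pruning rule (an assumption based on the
# abstract's description, not the authors' released code).
# heads[i-1] is the head of token i (1-based); 0 denotes the artificial root.

def ancestors(heads, node):
    """Chain of nodes from `node` up to (but excluding) the root."""
    chain = []
    while node != 0:
        chain.append(node)
        node = heads[node - 1]
    return chain

def lowest_common_ancestor(heads, a, b):
    """Lowest common ancestor of tokens a and b in the dependency tree."""
    seen = set(ancestors(heads, a))
    for n in ancestors(heads, b):
        if n in seen:
            return n
    return 0  # only reached if the tree is malformed

def tree_path(heads, a, b):
    """Shortest path between a and b: both ancestor chains cut at their LCA."""
    lca = lowest_common_ancestor(heads, a, b)
    path = set()
    for start in (a, b):
        for n in ancestors(heads, start):
            path.add(n)
            if n == lca:
                break
    return path

def mtpf_prune(heads, target_pair, all_entities):
    """Tokens kept for the target pair: the pair's ancestor chains up to their
    LCA, plus the shortest tree path from that LCA to every entity (path fusion)."""
    e1, e2 = target_pair
    lca = lowest_common_ancestor(heads, e1, e2)
    kept = tree_path(heads, e1, e2)        # ancestor chains of the target pair
    for e in all_entities:                  # fuse paths to all other entities
        kept |= tree_path(heads, lca, e)
    return kept

# Tiny usage example on a hypothetical 5-token sentence rooted at token 3.
heads = [3, 3, 0, 3, 4]
print(sorted(mtpf_prune(heads, target_pair=(1, 5), all_entities=[1, 2, 5])))
```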

This research was supported by the National Natural Science Foundation of China (62172053), the National Key R&D Program of China (2021YFC3340700, 2022YFC3303300, 2021YFC3340600, 2022YFC3300800), and the Fundamental Research Funds for the Central Universities (2023RC30).


Notes

  1. https://www.10jqka.com.cn/.

  2. https://github.com/PaddlePaddle/Research.


Author information

Corresponding author

Correspondence to Weike You.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Xing, W., You, W., Zhou, L., Yang, Z. (2024). Efficient Chinese Relation Extraction with Multi-entity Dependency Tree Pruning and Path-Fusion. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Communications in Computer and Information Science, vol 1966. Springer, Singapore. https://doi.org/10.1007/978-981-99-8148-9_45


  • DOI: https://doi.org/10.1007/978-981-99-8148-9_45

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8147-2

  • Online ISBN: 978-981-99-8148-9

  • eBook Packages: Computer Science, Computer Science (R0)
