
Document-Level Relation Extraction with a Dependency Syntax Transformer and Supervised Contrastive Learning

  • Conference paper
  • First Online:
Knowledge Graph and Semantic Computing: Knowledge Graph Empowers the Digital Economy (CCKS 2022)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1669))


Abstract

Document-level relation extraction, which extracts relational facts from plain text at the document level, is more challenging than its sentence-level counterpart. Studies have shown that the Transformer architecture models long-distance dependencies without regard to the syntax-level dependencies between tokens in the sequence, which hinders its ability to model long-range dependencies. Furthermore, the global information among relational triples and the local information around entities are both critical. In this paper, we propose a Dependency Syntax Transformer and Supervised Contrastive Learning model (DSTSC) for document-level relation extraction. Specifically, dependency syntax information guides the Transformer to strengthen attention between tokens linked by a dependency relation, improving its ability to model document-level dependencies. Supervised contrastive learning with fused knowledge captures the global information among relational triples, and Gaussian probability distributions are designed to capture the local information around entities. Experiments on two document-level relation extraction datasets, CDR and GDA, show that DSTSC achieves remarkable results.
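The supervised contrastive objective the abstract refers to follows the formulation of Khosla et al. (2020). The sketch below is a minimal NumPy version, not the paper's implementation: the function name `supcon_loss`, the temperature value, and the use of entity-pair embeddings are illustrative assumptions, and the paper's knowledge-fusion step is omitted.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss (Khosla et al., 2020), NumPy sketch.

    features: (N, D) array of embeddings, e.g. pooled entity-pair
              representations; L2-normalised inside the function.
    labels:   (N,) array of relation labels; same-label pairs are positives.
    """
    labels = np.asarray(labels)
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature          # (N, N) similarities
    n = len(labels)
    # Exclude self-similarity from the denominator.
    logits_mask = ~np.eye(n, dtype=bool)
    # Positives: same label, excluding the anchor itself.
    pos_mask = (labels[:, None] == labels[None, :]) & logits_mask

    # Numerically stable log-softmax over all non-anchor samples.
    sim_max = np.max(np.where(logits_mask, sim, -np.inf), axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * logits_mask
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))

    # Average log-probability of positives per anchor; anchors with no
    # positive in the batch are skipped.
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0
    mean_log_prob = (log_prob * pos_mask).sum(axis=1)[valid] / pos_counts[valid]
    return float(-mean_log_prob.mean())
```

The loss pulls same-relation embeddings together and pushes different-relation embeddings apart, which is one way the global information among relational triples can be captured.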


Notes

  1. https://stanfordnlp.github.io/CoreNLP/.
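The footnote above points to Stanford CoreNLP, which can supply the dependency parses. The paper's exact attention formulation is not reproduced on this page; the following is a minimal sketch of one way a parse could guide attention, assuming a simple additive bias on attention logits between tokens linked by a dependency edge. The function names and the `bias` parameter are illustrative, not taken from the paper.

```python
import numpy as np

def dependency_attention_bias(heads, bias=1.0):
    """Build an additive attention-bias matrix from a dependency parse.

    heads: list where heads[i] is the 0-based index of token i's head,
           or -1 for the root.
    Returns an (n, n) matrix to add to attention logits so that tokens
    linked by a dependency edge (in either direction) attend to each
    other more strongly.
    """
    n = len(heads)
    b = np.zeros((n, n))
    for i, h in enumerate(heads):
        if h >= 0:
            b[i, h] = bias   # token -> its head
            b[h, i] = bias   # head -> its dependent
    np.fill_diagonal(b, bias)  # keep self-attention boosted as well
    return b

def attention(scores, heads, bias=1.0):
    """Row-wise softmax over raw attention scores plus the dependency bias."""
    logits = scores + dependency_attention_bias(heads, bias)
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=-1, keepdims=True)
```

With zero raw scores, a token ends up attending more to its syntactic head and dependents than to unrelated tokens, which is the general effect the abstract describes.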


Acknowledgment

This work is supported by grants from the Natural Science Foundation of China (No. 62072070) and the Social Science Foundation of Liaoning Province (No. L20BTQ008).

Author information


Correspondence to Yijia Zhang.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Yang, M., Zhang, Y., Banbhrani, S.K., Lin, H., Lu, M. (2022). Document-Level Relation Extraction with a Dependency Syntax Transformer and Supervised Contrastive Learning. In: Sun, M., et al. Knowledge Graph and Semantic Computing: Knowledge Graph Empowers the Digital Economy. CCKS 2022. Communications in Computer and Information Science, vol 1669. Springer, Singapore. https://doi.org/10.1007/978-981-19-7596-7_4


  • DOI: https://doi.org/10.1007/978-981-19-7596-7_4

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-19-7595-0

  • Online ISBN: 978-981-19-7596-7

  • eBook Packages: Computer Science, Computer Science (R0)
