
Dual Hierarchical Contrastive Learning for Multi-level Implicit Discourse Relation Recognition

  • Conference paper
Natural Language Processing and Chinese Computing (NLPCC 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14303)


Abstract

Implicit discourse relation recognition (IDRR) is a challenging but vital task in discourse analysis, which aims at classifying the logical relations between arguments. Previous work infuses external knowledge or exploits label semantics to alleviate data scarcity, which either brings noise or underutilizes the semantic information contained in label embeddings; meanwhile, it remains difficult to model the label hierarchy. In this paper, we make full use of label embeddings as positives and negatives for our dual hierarchical contrastive learning framework, which contains two parts: 1) a hierarchical label contrastive loss (HLCL), which encourages fine-grained labels to be more similar to their correlative medium-grained labels than to the related coarse-grained labels; and 2) an arguments and connectives contrastive loss (ACCL), which makes arguments aggregate around their correlative fine-grained labels. The two modules interact with each other, making the similarity between arguments and their correlative fine-grained labels higher than that with the related coarse-grained labels. In this process, multi-level label semantics are integrated into the arguments, which provides guidance for classification. Experimental results show that our method achieves competitive performance against state-of-the-art systems.
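To make the dual objective in the abstract concrete, below is a minimal, hypothetical PyTorch-style sketch of the two InfoNCE-style losses (HLCL and ACCL). It is not the authors' implementation; it assumes argument and label embeddings are already computed, and all tensor names, shapes, and the temperature value are illustrative.

# Minimal sketch (not the paper's code) of the two contrastive objectives,
# assuming precomputed embeddings for arguments and for coarse-, medium-,
# and fine-grained label text.
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE: pull each anchor toward its positive, push it from the negatives."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)
    pos_sim = (anchor * positive).sum(-1, keepdim=True) / tau   # [B, 1]
    neg_sim = anchor @ negatives.t() / tau                      # [B, K]
    logits = torch.cat([pos_sim, neg_sim], dim=-1)              # positive is class 0
    return F.cross_entropy(logits, torch.zeros(anchor.size(0), dtype=torch.long))

# HLCL sketch: a fine-grained label should be closer to its medium-grained
# parent than to the related coarse-grained labels.
def hlcl(fine_emb, medium_parent_emb, coarse_emb, tau=0.1):
    return info_nce(fine_emb, medium_parent_emb, coarse_emb, tau)

# ACCL sketch: an argument-pair representation should cluster around its
# correlative fine-grained label, with other fine-grained labels as negatives.
def accl(arg_emb, gold_fine_emb, other_fine_emb, tau=0.1):
    return info_nce(arg_emb, gold_fine_emb, other_fine_emb, tau)

if __name__ == "__main__":
    B, K, d = 4, 10, 768   # batch size, number of negatives, hidden size (illustrative)
    loss = hlcl(torch.randn(B, d), torch.randn(B, d), torch.randn(K, d)) \
         + accl(torch.randn(B, d), torch.randn(B, d), torch.randn(K, d))
    print(loss.item())

In the paper the two losses are trained jointly so that label-hierarchy structure and argument-label alignment reinforce each other; the sketch simply sums them for illustration.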


Acknowledgements

Our work is supported by the National Natural Science Foundation of China (61976154) and CAAI-Huawei MindSpore Open Fund.

Author information


Corresponding author

Correspondence to Ruifang He.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Xu, J., He, R., Zhao, H., Wang, H., Zeng, L. (2023). Dual Hierarchical Contrastive Learning for Multi-level Implicit Discourse Relation Recognition. In: Liu, F., Duan, N., Xu, Q., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2023. Lecture Notes in Computer Science (LNAI), vol 14303. Springer, Cham. https://doi.org/10.1007/978-3-031-44696-2_5


  • DOI: https://doi.org/10.1007/978-3-031-44696-2_5

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44695-5

  • Online ISBN: 978-3-031-44696-2

  • eBook Packages: Computer Science, Computer Science (R0)
