DOI: 10.1145/3487075.3487156
Research article

Implicit Discourse Relation Classification Based on Semantic Graph Attention Networks

Published: 07 December 2021

Abstract

Implicit discourse relation classification is of great importance to discourse analysis. It aims to identify the logical relation between a pair of sentences. Compared with linear network models, graph neural networks have a more expressive structure for capturing cross-sentence interactions. This article therefore proposes a semantic graph attention network for implicit discourse relation classification. Specifically, we design a semantic graph that describes both the syntactic structure of each sentence and the semantic interactions between the sentence pair. A convolutional neural network (CNN) with kernels of different sizes then extracts multi-granularity semantic features. Experimental results on the Penn Discourse TreeBank 2.0 (PDTB 2.0) show that our model performs well.
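The abstract's core building block is a graph attention (GAT) layer applied to a semantic graph over the sentence pair. The sketch below is a minimal NumPy implementation of a single GAT layer in the style of Velickovic et al. (2018); it is an illustrative reconstruction, not the authors' code, and the matrix names, shapes, and example graph are assumptions made here for clarity.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(H, A, W, a, slope=0.2):
    """One graph-attention layer (single head, no nonlinearity on output).

    H : (n, f_in)     node feature matrix (e.g. word embeddings)
    A : (n, n)        adjacency matrix, 1 where an edge exists;
                      self-loops assumed to be included
    W : (f_in, f_out) shared linear projection
    a : (2 * f_out,)  attention vector, split into source/target halves
    """
    Z = H @ W                                    # project node features
    f_out = Z.shape[1]
    # e[i, j] = LeakyReLU(a^T [W h_i || W h_j]), computed via the split trick
    src = Z @ a[:f_out]                          # (n,) contribution of node i
    dst = Z @ a[f_out:]                          # (n,) contribution of node j
    e = leaky_relu(src[:, None] + dst[None, :], slope)
    e = np.where(A > 0, e, -1e9)                 # mask non-neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # row-wise softmax
    return alpha @ Z                             # attention-weighted aggregation
```

In the paper's setting, the nodes would presumably be words, with edges from intra-sentence syntactic structure plus cross-sentence semantic links, and the layer's output would feed the multi-kernel CNN; those wiring details are not specified in the abstract.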


          Published In

CSAE '21: Proceedings of the 5th International Conference on Computer Science and Application Engineering, October 2021, 660 pages
ISBN: 9781450389853
DOI: 10.1145/3487075
Publisher: Association for Computing Machinery, New York, NY, United States


          Author Tags

          1. Discourse parsing
          2. Graph attention network
          3. Implicit discourse relation classification
          4. Semantic interaction

          Qualifiers

          • Research-article
          • Research
          • Refereed limited

          Conference

          CSAE 2021

          Acceptance Rates

          Overall Acceptance Rate 368 of 770 submissions, 48%
