A Recursive Information Flow Gated Model for RST-Style Text-Level Discourse Parsing

  • Conference paper
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11839)

Abstract

Text-level discourse parsing is notoriously difficult due to the long-distance dependency over the document and the deep hierarchical structure of the discourse. In this paper, we attempt to model the representation of a document recursively via shift-reduce operations. Intuitively, humans tend to understand macro and micro texts from different perspectives, so we propose a recursive model to fuse multiple information flows and strengthen the representation of text spans. During parsing, the proposed model can synthetically grade each information flow according to the granularity of the text. Experimentation on the RST-DT corpus shows that our parser can outperform the state-of-the-art in nuclearity detection under stringent discourse parsing evaluations.
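The shift-reduce parsing with gated fusion described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dimensionality, the parameter names (`W_left`, `W_right`, `W_gate`), and the specific gating form are all illustrative assumptions; the paper's model additionally grades the information flows by text granularity, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

DIM = 8  # span-representation size (illustrative)

# Hypothetical parameters of the gated fusion: one projection per
# child-span information flow, plus a gate over their concatenation.
W_left = rng.standard_normal((DIM, DIM)) * 0.1
W_right = rng.standard_normal((DIM, DIM)) * 0.1
W_gate = rng.standard_normal((DIM, 2 * DIM)) * 0.1

def reduce_spans(left, right):
    """Fuse two child-span vectors into one parent-span vector.

    The gate g weighs the two information flows element-wise, so each
    REDUCE step can grade how much each child contributes.
    """
    g = sigmoid(W_gate @ np.concatenate([left, right]))
    return g * np.tanh(W_left @ left) + (1 - g) * np.tanh(W_right @ right)

def shift_reduce_parse(edus, actions):
    """Apply a sequence of SHIFT/REDUCE actions to EDU vectors."""
    queue = list(edus)  # elementary discourse units awaiting SHIFT
    stack = []          # partially built span representations
    for act in actions:
        if act == "SHIFT":
            stack.append(queue.pop(0))
        elif act == "REDUCE":
            right = stack.pop()
            left = stack.pop()
            stack.append(reduce_spans(left, right))
    return stack

# Four EDUs composed bottom-up into a single document representation.
edus = [rng.standard_normal(DIM) for _ in range(4)]
actions = ["SHIFT", "SHIFT", "REDUCE",
           "SHIFT", "SHIFT", "REDUCE", "REDUCE"]
doc = shift_reduce_parse(edus, actions)
```

After the final REDUCE the stack holds one vector representing the whole document; a real parser would predict the action sequence and the nuclearity/relation labels rather than fix them in advance.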



Acknowledgements

This work was supported by the National Natural Science Foundation of China under the Artificial Intelligence Emergency Project (No. 61751206) and Project No. 61876118.

Author information

Corresponding author

Correspondence to Fang Kong.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Zhang, L., Tan, X., Kong, F., Zhou, G. (2019). A Recursive Information Flow Gated Model for RST-Style Text-Level Discourse Parsing. In: Tang, J., Kan, M.Y., Zhao, D., Li, S., Zan, H. (eds.) Natural Language Processing and Chinese Computing. NLPCC 2019. Lecture Notes in Computer Science, vol. 11839. Springer, Cham. https://doi.org/10.1007/978-3-030-32236-6_20

  • DOI: https://doi.org/10.1007/978-3-030-32236-6_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32235-9

  • Online ISBN: 978-3-030-32236-6

  • eBook Packages: Computer Science; Computer Science (R0)
