Syntax-guided text generation via graph neural network

  • Research Paper
  • Published in Science China Information Sciences

Abstract

Text generation is a fundamental task in natural language processing. Most existing models generate text sequentially and have difficulty modeling complex dependency structures. In this paper, we treat text generation as a graph generation problem, exploiting both syntactic and word-ordering relationships. Leveraging the graph neural network framework, we propose the word graph model, which builds a sentence incrementally and maintains syntactic integrity via a syntax-driven, top-down, breadth-first generation process. Experimental results on both synthetic and real text generation tasks demonstrate the efficacy of our approach.
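
To make the generation process concrete, the Python sketch below mimics a syntax-driven, top-down, breadth-first construction on a toy example. It is a minimal illustration only: the trained graph neural network is replaced by a hypothetical stub (predict_children) with a hard-coded toy grammar, and all names are illustrative rather than taken from the paper's implementation.

# Minimal sketch of syntax-driven, top-down, breadth-first sentence
# generation over a word graph. The predictor is a hypothetical
# stand-in for a trained GNN; the toy grammar is illustrative only.
from collections import deque

def predict_children(word):
    """Hypothetical stub for the GNN that, given a parent word,
    returns its dependents as (child_word, 'left'|'right') pairs."""
    toy_grammar = {
        "<ROOT>": [("chased", "right")],
        "chased": [("dog", "left"), ("cat", "right")],
        "dog":    [("the", "left")],
        "cat":    [("a", "left")],
    }
    return toy_grammar.get(word, [])

def generate():
    # Breadth-first, top-down expansion: the root word is generated
    # first, then its dependents, then theirs, level by level.
    root = {"word": "<ROOT>", "left": [], "right": []}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for child_word, side in predict_children(node["word"]):
            child = {"word": child_word, "left": [], "right": []}
            node[side].append(child)
            queue.append(child)
    return root

def linearize(node):
    # Recover surface word order from the dependency structure:
    # left dependents, then the head word, then right dependents.
    words = []
    for child in node["left"]:
        words += linearize(child)
    if node["word"] != "<ROOT>":
        words.append(node["word"])
    for child in node["right"]:
        words += linearize(child)
    return words

print(" ".join(linearize(generate())))  # -> "the dog chased a cat"

The breadth-first ordering fixes the syntactic skeleton before lower-level words are emitted, which is what allows syntactic integrity to be maintained throughout incremental generation.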

Acknowledgements

This work was supported by the National Key Research and Development Program of China (Grant No. 2018YFC0831103), the Shanghai Municipal Science and Technology Major Project (Grant No. 2018SHZDZX01), and Zhejiang Lab.

Author information

Corresponding author

Correspondence to Xipeng Qiu.

About this article

Cite this article

Guo, Q., Qiu, X., Xue, X. et al. Syntax-guided text generation via graph neural network. Sci. China Inf. Sci. 64, 152102 (2021). https://doi.org/10.1007/s11432-019-2740-1
