A Study on Self-attention Mechanism for AMR-to-text Generation

  • Conference paper
Natural Language Processing and Information Systems (NLDB 2019)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11608)

Abstract

Introduced by Vaswani et al., the Transformer architecture, through its effective use of the self-attention mechanism, has shown outstanding performance in translating text from one language to another. In this paper, we conduct experiments using self-attention to convert an abstract meaning representation (AMR) graph, a semantic representation, into a natural language sentence, a task known as AMR-to-text generation. On the benchmark dataset for this task, we obtain promising results compared to existing deep learning methods in the literature.
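To make the mechanism concrete: the sketch below is a minimal NumPy implementation of single-head scaled dot-product self-attention, the building block of the Transformer the abstract refers to. It is an illustrative assumption on our part, not the authors' code; the input is a toy matrix of embedded tokens (for AMR-to-text, these would be tokens of a linearized AMR graph), and all shapes and weight matrices are made up for the example.

import numpy as np

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model) embedded tokens; w_*: (d_model, d_k) projections.
    q, k, v = x @ w_q, x @ w_k, x @ w_v             # queries, keys, values
    scores = (q @ k.T) / np.sqrt(q.shape[-1])       # scaled pairwise affinities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                              # attention-weighted values

# Toy usage: 5 tokens of a linearized AMR, model width 8, head width 4.
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 4)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (5, 4)

In the full Transformer, several such heads run in parallel and their outputs are concatenated, with position information injected separately; reference [13] refines the latter with relative position representations.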


References

  1. Flanigan, J., Dyer, C., Smith, N.A., Carbonell, J.: Generation from abstract meaning representation using tree transducers. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 731–739. Association for Computational Linguistics (2016). https://doi.org/10.18653/v1/N16-1087, http://aclweb.org/anthology/N16-1087

  2. Gu, J., Lu, Z., Li, H., Li, V.O.: Incorporating copying mechanism in sequence-to-sequence learning. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Long Papers, vol. 1, pp. 1631–1640. Association for Computational Linguistics (2016). https://doi.org/10.18653/v1/P16-1154, http://aclweb.org/anthology/P16-1154

  3. Jones, B., Andreas, J., Bauer, D., Moritz Hermann, K., Knight, K.: Semantics-based machine translation with hyperedge replacement grammars. In: 24th International Conference on Computational Linguistics - Proceedings of COLING 2012: Technical Papers, pp. 1359–1376, December 2012

  4. Konstas, I., Iyer, S., Yatskar, M., Choi, Y., Zettlemoyer, L.: Neural AMR: Sequence-to-sequence models for parsing and generation. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Long Papers, vol. 1, pp. 146–157. Association for Computational Linguistics (2017). https://doi.org/10.18653/v1/P17-1014, http://aclweb.org/anthology/P17-1014

  5. Banarescu, L., et al.: Abstract meaning representation for sembanking. In: Proceedings of the 7th Linguistic Annotation Workshop and Interoperability with Discourse, pp. 178–186 (2013)

  6. Liu, F., Flanigan, J., Thomson, S., Sadeh, N., Smith, N.A.: Toward abstractive summarization using semantic representations. In: Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 1077–1086 (2015)

  7. Papineni, K., Roukos, S., Ward, T., Zhu, W.J.: Bleu: a method for automatic evaluation of machine translation. In: Proceedings of 40th Annual Meeting of the Association for Computational Linguistics, pp. 311–318. Association for Computational Linguistics, Philadelphia, July 2002. https://doi.org/10.3115/1073083.1073135, https://www.aclweb.org/anthology/P02-1040

  8. Peng, X., Wang, C., Gildea, D., Xue, N.: Addressing the data sparsity issue in neural AMR parsing. In: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics, Long Papers, vol. 1, pp. 366–375. Association for Computational Linguistics (2017), http://aclweb.org/anthology/E17-1035

  9. Pennington, J., Socher, R., Manning, C.: Glove: global vectors for word representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1532–1543. Association for Computational Linguistics (2014). https://doi.org/10.3115/v1/D14-1162, http://aclweb.org/anthology/D14-1162

  10. Pourdamghani, N., Knight, K., Hermjakob, U.: Generating English from abstract meaning representations. In: Proceedings of the 9th International Natural Language Generation Conference, pp. 21–25. Association for Computational Linguistics (2016). https://doi.org/10.18653/v1/W16-6603, http://aclweb.org/anthology/W16-6603

  11. Rao, S., Marcu, D., Knight, K., Daumé III, H.: Biomedical event extraction using abstract meaning representation. In: Proceedings of the BioNLP 2017 Workshop, pp. 126–135 (2017)

  12. Sachan, M., Xing, E.: Machine comprehension using rich semantic representations. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Short Papers, vol. 2, pp. 486–492. Association for Computational Linguistics (2016). https://doi.org/10.18653/v1/P16-2079, http://www.aclweb.org/anthology/P16-2079

  13. Shaw, P., Uszkoreit, J., Vaswani, A.: Self-attention with relative position representations. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Short Papers, vol. 2, pp. 464–468. Association for Computational Linguistics (2018). https://doi.org/10.18653/v1/N18-2074, http://aclweb.org/anthology/N18-2074

  14. Song, L., Peng, X., Zhang, Y., Wang, Z., Gildea, D.: AMR-to-text generation with synchronous node replacement grammar. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Short Papers, vol. 2, pp. 7–13. Association for Computational Linguistics (2017). https://doi.org/10.18653/v1/P17-2002, http://aclweb.org/anthology/P17-2002

  15. Song, L., Zhang, Y., Peng, X., Wang, Z., Gildea, D.: AMR-to-text generation as a traveling salesman problem. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2084–2089. Association for Computational Linguistics (2016). https://doi.org/10.18653/v1/D16-1224, http://aclweb.org/anthology/D16-1224

  16. Song, L., Zhang, Y., Wang, Z., Gildea, D.: A graph-to-sequence model for AMR-to-text generation. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Long Papers, vol. 1, pp. 1616–1626. Association for Computational Linguistics (2018). http://aclweb.org/anthology/P18-1150

  17. Vaswani, A., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems, pp. 5998–6008 (2017)


Author information

Correspondence to Vu Trong Sinh or Nguyen Le Minh.



Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Sinh, V.T., Minh, N.L. (2019). A Study on Self-attention Mechanism for AMR-to-text Generation. In: Métais, E., Meziane, F., Vadera, S., Sugumaran, V., Saraee, M. (eds) Natural Language Processing and Information Systems. NLDB 2019. Lecture Notes in Computer Science, vol 11608. Springer, Cham. https://doi.org/10.1007/978-3-030-23281-8_27

  • DOI: https://doi.org/10.1007/978-3-030-23281-8_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-23280-1

  • Online ISBN: 978-3-030-23281-8

  • eBook Packages: Computer Science; Computer Science (R0)
