Abstract
Abstract Meaning Representation (AMR) parsing aims to extract a structured representation of a sentence's meaning. This paper enhances an existing AMR parsing pipeline, drawing inspiration from state-of-the-art solutions in dependency parsing. The existing Concept Identification module is improved with Pointer-Generator Networks, and the use of embeddings brings a further considerable improvement to this module. An alternative approach based on Transformers is also provided, an architecture that requires large datasets to predict concepts accurately. For predicting the relations between concepts, the proposed pipeline merges the Heads Selection task and a now-trainable Arc Labelling task into a joint Relation Identification module, which improves the overall performance of edge prediction. Together, these improvements yield a fully trainable AMR parser that can be improved further through end-to-end training.
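The abstract gives no implementation details, but a joint Relation Identification module of the kind it describes is commonly built from biaffine head selection plus a trainable arc-label scorer over shared concept encodings, in the spirit of Dozat and Manning's biaffine parser. The following PyTorch sketch is purely illustrative: the class name, dimensions, and scoring layout are assumptions, not the authors' code.

import torch
import torch.nn as nn

class JointRelationIdentification(nn.Module):
    """Hypothetical joint head-selection + arc-labelling module (not the paper's code)."""

    def __init__(self, enc_dim: int, arc_dim: int, label_dim: int, n_labels: int):
        super().__init__()
        # Separate head/dependent projections, as in biaffine dependency parsing.
        self.arc_head = nn.Linear(enc_dim, arc_dim)
        self.arc_dep = nn.Linear(enc_dim, arc_dim)
        self.label_head = nn.Linear(enc_dim, label_dim)
        self.label_dep = nn.Linear(enc_dim, label_dim)
        # Biaffine score tensors (bias terms omitted for brevity).
        self.arc_biaffine = nn.Parameter(torch.randn(arc_dim, arc_dim))
        self.label_biaffine = nn.Parameter(torch.randn(n_labels, label_dim, label_dim))

    def forward(self, concepts: torch.Tensor):
        # concepts: (batch, n_concepts, enc_dim) encoded concept representations.
        h_arc = self.arc_head(concepts)    # (B, N, A)
        d_arc = self.arc_dep(concepts)     # (B, N, A)
        # arc_scores[b, i, j]: score of concept j being the head of concept i.
        arc_scores = d_arc @ self.arc_biaffine @ h_arc.transpose(1, 2)  # (B, N, N)
        h_lab = self.label_head(concepts)  # (B, N, L)
        d_lab = self.label_dep(concepts)   # (B, N, L)
        # label_scores[b, r, i, j]: score of relation label r on the edge j -> i.
        label_scores = torch.einsum("bil,rlm,bjm->brij",
                                    d_lab, self.label_biaffine, h_lab)  # (B, R, N, N)
        return arc_scores, label_scores

# Toy usage: one sentence with 2 identified concepts and 16-dim encodings.
model = JointRelationIdentification(enc_dim=16, arc_dim=8, label_dim=8, n_labels=5)
arc, lab = model(torch.randn(1, 2, 16))
print(arc.shape, lab.shape)  # torch.Size([1, 2, 2]) torch.Size([1, 5, 2, 2])

Since both score tensors are differentiable functions of the same concept encodings, head selection and arc labelling can be trained jointly with cross-entropy losses, which is one way such a module could become "completely trainable" as the abstract claims.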