Abstract
The data-to-text generation task aims to generate text from structured data. In this work, we focus on a relatively new and challenging variant, equation-to-text generation: producing math word problems from equations. We propose a novel equation-to-problem text generation model. Our model first uses a template-aware equation encoder together with a Variational Autoencoder (VAE) to bridge the gap between abstract math tokens and natural-language text. We then introduce a topic selector and a topic controller to prevent topic drift. To avoid commonsense violations, we design a pre-training stage together with a commonsense enforcement mechanism. We construct a dataset and evaluate our model with both automatic metrics and human evaluation. Experiments show that our model significantly outperforms baseline models, and further analysis shows it is effective in mitigating topic drift and commonsense violations.
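The abstract does not specify how the template-aware equation encoder is implemented. As a rough, hypothetical illustration of what "template-aware" preprocessing might involve, the sketch below abstracts numeric literals in an equation into a placeholder token, so a downstream encoder would see the equation's template structure separately from its concrete values. The token name `<num>` and the function `equation_to_template` are assumptions for this sketch, not names from the paper.

```python
import re

NUM_TOKEN = "<num>"  # hypothetical placeholder for numeric literals

def equation_to_template(equation: str):
    """Split an equation into tokens and replace each numeric literal
    with NUM_TOKEN, returning (template_tokens, extracted_numbers).

    This is a minimal sketch of template abstraction, not the paper's
    actual encoder input pipeline.
    """
    # Match numbers (possibly decimal), variable names, and operators.
    tokens = re.findall(r"\d+\.?\d*|[a-zA-Z]+|[+\-*/=()]", equation)
    template, numbers = [], []
    for tok in tokens:
        if re.fullmatch(r"\d+\.?\d*", tok):
            numbers.append(tok)
            template.append(NUM_TOKEN)
        else:
            template.append(tok)
    return template, numbers
```

For example, `equation_to_template("3*x + 5 = 20")` yields the template `['<num>', '*', 'x', '+', '<num>', '=', '<num>']` alongside the extracted values `['3', '5', '20']`; equations with the same structure but different constants map to the same template.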
Cite this paper
Cao, T., Zeng, S., Zhao, S., Mansur, M., Chang, B. (2021). Generating Math Word Problems from Equations with Topic Consistency Maintaining and Commonsense Enforcement. In: Farkaš, I., Masulli, P., Otte, S., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2021. ICANN 2021. Lecture Notes in Computer Science(), vol 12893. Springer, Cham. https://doi.org/10.1007/978-3-030-86365-4_6