Generating Math Word Problems from Equations with Topic Consistency Maintaining and Commonsense Enforcement

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12893)

Abstract

The data-to-text generation task aims to generate text from structured data. In this work, we focus on a relatively new and challenging equation-to-text generation task – generating math word problems from equations – and propose a novel equation-to-problem text generation model. Our model first uses a template-aware equation encoder and a Variational AutoEncoder (VAE) to bridge the gap between abstract math tokens and text. We then introduce a topic selector and a topic controller to prevent topic drifting. To avoid commonsense violations, we design a pre-training stage together with a commonsense enforcement mechanism. We construct a dataset and evaluate our model with both automatic metrics and human evaluation. Experiments show that our model significantly outperforms baseline models, and further analysis shows it is effective in tackling topic drifting and commonsense violation problems.
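To make the VAE bridging idea in the abstract concrete, here is a minimal NumPy sketch of encoding an equation's tokens into a latent variable via the standard reparameterization trick. All names, dimensions, and the mean-pooling encoder are illustrative assumptions, not the paper's actual architecture; the paper's template-aware encoder and decoder are far richer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary of math tokens (hypothetical; the paper's vocabulary differs).
vocab = {"x": 0, "y": 1, "+": 2, "=": 3, "NUM": 4}
embed_dim, latent_dim = 8, 4

# Randomly initialized parameters stand in for trained weights.
E = rng.normal(size=(len(vocab), embed_dim))         # token embeddings
W_mu = rng.normal(size=(embed_dim, latent_dim))      # mean head
W_logvar = rng.normal(size=(embed_dim, latent_dim))  # log-variance head

def encode_equation(tokens):
    """Mean-pool token embeddings into one equation representation (a stand-in encoder)."""
    ids = [vocab[t] for t in tokens]
    return E[ids].mean(axis=0)

def reparameterize(h):
    """VAE trick: z = mu + sigma * eps keeps sampling differentiable w.r.t. mu, logvar."""
    mu, logvar = h @ W_mu, h @ W_logvar
    eps = rng.normal(size=latent_dim)
    return mu + np.exp(0.5 * logvar) * eps, mu, logvar

def kl_divergence(mu, logvar):
    """KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior; always >= 0."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

h = encode_equation(["x", "+", "NUM", "=", "y"])
z, mu, logvar = reparameterize(h)
print(z.shape)  # (4,)
```

In a full model, z would condition a text decoder, and the KL term would be added to the reconstruction loss during training.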


Notes

  1. https://github.com/caotianyang/math2textcs1.

  2. https://github.com/commonsense/conceptnet5.


Author information

Correspondence to Baobao Chang.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Cao, T., Zeng, S., Zhao, S., Mansur, M., Chang, B. (2021). Generating Math Word Problems from Equations with Topic Consistency Maintaining and Commonsense Enforcement. In: Farkaš, I., Masulli, P., Otte, S., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2021. Lecture Notes in Computer Science, vol 12893. Springer, Cham. https://doi.org/10.1007/978-3-030-86365-4_6

  • DOI: https://doi.org/10.1007/978-3-030-86365-4_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86364-7

  • Online ISBN: 978-3-030-86365-4

  • eBook Packages: Computer Science (R0)
