TPoet: Topic-Enhanced Chinese Poetry Generation

Published: 19 June 2023

Abstract

Chinese poetry generation remains a challenging task in natural language processing because of the unique literariness and aesthetics of poetry. In most cases, the content of a poem is topic-related; that is, specific thoughts or emotions are expressed around given topics. However, topic information is rarely taken into account in current studies of poetry generation models. In this article, we propose TPoet, a topic-enhanced Chinese poetry generation model that integrates a topic model into a Transformer-based auto-regressive text generation model. By feeding topic information into the input layer and a heterogeneous attention mechanism, TPoet implicitly learns the latent topic distribution. In addition, by adding identifiers for segment, rhyme, and tone, the model explicitly learns the formal constraints of the generated poems. Extensive experimental results show that TPoet generates higher-quality poems than current state-of-the-art models and systems, and that the topic consistency and diversity of the generated poems are significantly improved as well.
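The abstract describes topic information being injected at the input layer alongside segment, rhyme, and tone identifiers. The sketch below is a minimal illustration, under stated assumptions, of one way such an input layer could be assembled; it is not the authors' implementation, the heterogeneous attention mechanism is not shown, and all module and parameter names are hypothetical.

```python
# Minimal sketch (not the authors' released code) of an input layer that sums
# token embeddings, a projected topic distribution (e.g., from an LDA-style
# topic model), and segment/rhyme/tone identifier embeddings before a
# Transformer decoder. All names and dimensions here are illustrative.
import torch
import torch.nn as nn

class TopicAwareInputLayer(nn.Module):
    def __init__(self, vocab_size, n_topics, n_segments, n_rhymes, n_tones,
                 d_model=512, max_len=128):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)    # character/token embedding
        self.topic = nn.Linear(n_topics, d_model)       # project topic distribution into model space
        self.seg = nn.Embedding(n_segments, d_model)    # which line/segment of the poem
        self.rhyme = nn.Embedding(n_rhymes, d_model)    # rhyme-class identifier
        self.tone = nn.Embedding(n_tones, d_model)      # level/oblique tone identifier
        self.pos = nn.Embedding(max_len, d_model)       # learned positional embedding

    def forward(self, tokens, topic_dist, seg_ids, rhyme_ids, tone_ids):
        # tokens, seg_ids, rhyme_ids, tone_ids: (batch, seq_len)
        # topic_dist: (batch, n_topics), broadcast over the whole sequence
        positions = torch.arange(tokens.size(1), device=tokens.device)
        x = (self.tok(tokens)
             + self.topic(topic_dist).unsqueeze(1)
             + self.seg(seg_ids) + self.rhyme(rhyme_ids) + self.tone(tone_ids)
             + self.pos(positions))
        return x  # fed to an auto-regressive (causally masked) Transformer decoder
```

In the full model described by the abstract, the resulting sequence would be consumed by a Transformer decoder trained auto-regressively, with the heterogeneous attention mechanism (not sketched here) giving the topic signal an additional path into the attention layers.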


Cited By

  • (2024) Survey on Deep Learning Applications in Automated Chinese Poetry Composition. 2024 5th International Conference on Artificial Intelligence and Computer Engineering (ICAICE), 662–666. https://doi.org/10.1109/ICAICE63571.2024.10864023. Online publication date: 8 November 2024.

Published In

ACM Transactions on Asian and Low-Resource Language Information Processing, Volume 22, Issue 6
June 2023
635 pages
ISSN:2375-4699
EISSN:2375-4702
DOI:10.1145/3604597

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 19 June 2023
Online AM: 21 April 2023
Accepted: 02 April 2023
Revised: 20 January 2023
Received: 26 July 2022
Published in TALLIP Volume 22, Issue 6

Author Tags

  1. Deep learning
  2. poetry generation
  3. topic model

Qualifiers

  • Research-article

Funding Sources

  • National Natural Science Foundation of China
  • Major Science and Technology Projects of Yunnan Province
  • Inner Mongolia Science Foundation
