Abstract
Automatic lyric generation faces the challenge of producing lines that are meaningful and semantically related to a given scenario. Traditional keyword- or template-based lyric generation systems ignore the patterns and styles of individual lyricists and suffer from rigid lyric construction and costly maintenance. This paper proposes a Chinese lyric generation system that learns the patterns and styles of particular lyricists and generates lyrics automatically. A long short-term memory (LSTM) network processes each lyric line and generates the next line word by word. A hierarchical attention model captures contextual information at both the sentence and document levels, learning high-level representations of each lyric line and of the entire document. An LSTM decoder then decodes this semantic contextual information into lyric lines word by word. Results on the automatically generated lyrics show that the proposed method captures the patterns and styles of a given lyricist, fits the intended scenarios, and outperforms state-of-the-art models.
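To make the described architecture concrete, the sketch below shows one possible realization of a line-level LSTM encoder, hierarchical (word- and line-level) attention, and a context-conditioned LSTM decoder in PyTorch. All module names (LineEncoder, HierarchicalAttention, LyricDecoder) and hyperparameters are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch, assuming a PyTorch-style implementation of the idea above.
# Names and dimensions are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LineEncoder(nn.Module):
    """Encode one lyric line (a sequence of word ids) with an LSTM."""
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, line_ids):                      # (batch, line_len)
        word_states, _ = self.lstm(self.embed(line_ids))
        return word_states                            # (batch, line_len, hidden)


class HierarchicalAttention(nn.Module):
    """Word-level attention builds each line vector; line-level attention
    builds a document vector from the line vectors."""
    def __init__(self, hidden_dim=256):
        super().__init__()
        self.word_query = nn.Linear(hidden_dim, 1)
        self.line_query = nn.Linear(hidden_dim, 1)

    def forward(self, word_states_per_line):          # list of (batch, len, h)
        line_vecs = []
        for word_states in word_states_per_line:
            a = F.softmax(self.word_query(word_states), dim=1)   # word weights
            line_vecs.append((a * word_states).sum(dim=1))       # (batch, h)
        lines = torch.stack(line_vecs, dim=1)                    # (batch, n, h)
        b = F.softmax(self.line_query(lines), dim=1)             # line weights
        doc_vec = (b * lines).sum(dim=1)                         # (batch, h)
        return lines, doc_vec


class LyricDecoder(nn.Module):
    """Generate the next lyric line word by word, conditioned on the
    document-level context vector."""
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim + hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prev_word_ids, doc_vec):        # (batch, tgt_len), (batch, h)
        emb = self.embed(prev_word_ids)
        ctx = doc_vec.unsqueeze(1).expand(-1, emb.size(1), -1)
        states, _ = self.lstm(torch.cat([emb, ctx], dim=-1))
        return self.out(states)                       # (batch, tgt_len, vocab)
```

In this sketch the document vector summarizes all previously generated lines, so each decoding step sees both the current word history and the document-level context; a full system would also feed the line-level vectors into the decoder's attention at every step.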



Acknowledgements
This work is supported by the National Natural Science Foundation of China under project 61303094, by the Science and Technology Commission of Shanghai Municipality under project 16511102400, and by the Innovation Program of Shanghai Municipal Education Commission (14YZ024).
Cite this article
Wu, X., Du, Z., Guo, Y. et al. Hierarchical attention based long short-term memory for Chinese lyric generation. Appl Intell 49, 44–52 (2019). https://doi.org/10.1007/s10489-018-1206-2