Abstract
With the rapid development of deep learning in text summarization and dialogue generation, researchers are now reconsidering the story-generation task, which dates back to the 1970s. Deep learning methods are gradually being adopted to solve problems in traditional story generation, making story generation a new research hotspot in text generation. However, the widely used seq2seq model cannot adequately model long-distance dependencies in text. As a result, it struggles with story generation, where relations across long spans of text must be captured and coherence and vividness are critical. Recent years have therefore seen numerous proposals for better modeling methods. In this paper, we present a comprehensive study of story generation. We first introduce the relevant concepts of story generation, its background, and the current state of research. We then summarize and analyze standard story-generation methods. Based on the kinds of constraints users impose, we divide these methods into three categories: theme-oriented models, storyline-oriented models, and human-machine-interaction-oriented models. On this basis, we discuss the basic ideas and main concerns of each approach and compare their strengths and weaknesses. Finally, we analyze and forecast future developments that could push story-generation research toward a new frontier.
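To make the long-distance limitation concrete, the following is a minimal illustrative sketch (ours, not from the paper): a vanilla RNN encoder, as used in plain seq2seq models, folds an arbitrarily long token sequence into a single fixed-size hidden vector, so information about early story events must survive every subsequent update. All names and sizes below are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative vanilla RNN encoder: whatever the story length, the entire
# input is compressed into one fixed-size hidden state. This bottleneck is
# one reason plain seq2seq models struggle with long-range story coherence.

rng = np.random.default_rng(0)
hidden_size, vocab_size = 8, 20

W_xh = rng.normal(scale=0.1, size=(hidden_size, vocab_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))

def encode(token_ids):
    """Fold a token-id sequence into a single hidden state vector."""
    h = np.zeros(hidden_size)
    for t in token_ids:
        x = np.zeros(vocab_size)
        x[t] = 1.0                      # one-hot "embedding" of the token
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent update
    return h

short_story = [1, 2, 3]
long_story = [1, 2, 3] + [5] * 200  # 200 extra tokens after the opening

# Both summaries have the same fixed dimensionality regardless of length:
print(encode(short_story).shape, encode(long_story).shape)
```

Attention mechanisms and the hierarchical, plan-based, and interactive models surveyed in the paper can all be read as different ways of routing around this fixed-size bottleneck.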
Change history
04 November 2019
In the version of this paper that was originally published, the last name of the author Sisi Xuanyuan was misspelt. This has been corrected.
Acknowledgment
This work is supported by the National Key Research and Development Program of China (No. 2017YFB1400805).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Hou, C., Zhou, C., Zhou, K., Sun, J., Xuanyuan, S. (2019). A Survey of Deep Learning Applied to Story Generation. In: Qiu, M. (ed.) Smart Computing and Communication. SmartCom 2019. Lecture Notes in Computer Science, vol 11910. Springer, Cham. https://doi.org/10.1007/978-3-030-34139-8_1
DOI: https://doi.org/10.1007/978-3-030-34139-8_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-34138-1
Online ISBN: 978-3-030-34139-8
eBook Packages: Computer Science (R0)