Abstract
We present The Book of Endless History, an infinite Wikipedia of fantasy stories written in the style of Borges and Calvino, exploring the use of structural conditioning on GPT2 to generate text with explicit subjects and embedded web-links. Users are presented with a Wikipedia-like interface containing a short fantasy description of the topic, with embedded web-links to other related subjects. Users may click on these links to learn more about different topics, following an endless trail of generated pages. The GPT2 architecture is a text completion model: it has no explicit understanding of structure, and integrating it with authorial intent can be a challenge. Nevertheless, through this work we show that it can be conditioned both to write about a given subject and to generate the topology of an encyclopedia. We refer to this technique as subject conditioning, or more generally, structural conditioning.
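The abstract does not specify the link markup or rendering code. As a minimal sketch, assuming a wiki-style [[Subject]] link syntax (an illustrative choice, not the paper's confirmed format, as are the names `LINK_RE` and `render_page`), the embedded links in each generated page could be turned into the encyclopedia's topology like this:

```python
import re

# Illustrative assumption: generated pages embed links as [[Subject]] spans.
LINK_RE = re.compile(r"\[\[([^\]]+)\]\]")

def render_page(generated: str) -> tuple[str, list[str]]:
    """Split a generated page into display text and outgoing links.

    Each [[Subject]] span becomes a clickable link; following one
    triggers generation of a fresh page conditioned on that subject,
    so the encyclopedia grows without bound as the reader browses.
    """
    links = LINK_RE.findall(generated)
    text = LINK_RE.sub(lambda m: m.group(1), generated)  # strip brackets for display
    return text, links

text, links = render_page(
    "The [[Mirror Monks of Ys]] kept the last map of the [[Drowned Road]]."
)
assert links == ["Mirror Monks of Ys", "Drowned Road"]
```

Under a scheme like this the graph of pages is never materialised up front; it is discovered lazily as the reader clicks, which is what makes the encyclopedia effectively endless.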
Notes
- 1. As of the writing of this paper, GPT2 comes in two publicly released sizes, 117M and 345M parameters, both smaller versions of an unreleased 1558M model trained by OpenAI. The 345M model, while slower and larger, is much more robust than the 117M.
- 2. The training took approximately 4–6 hours on an NVIDIA T4 GPU, provided for free by Google Colab.
- 3. As you may notice, this causes our text to be double spaced, with subject lines in between the real lines; a toy illustration of this interleaved format follows these notes. However, GPT2 learns to ignore this structure remarkably quickly.
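Note 3 describes the training corpus as double spaced, with subject lines interleaved between the real lines. A toy sketch of what that preprocessing might look like, assuming a hypothetical `| ` delimiter (the actual delimiter is not given in the text):

```python
# Toy illustration of the interleaved format from note 3.  The exact
# delimiter is not stated in the paper; the "| " prefix is an assumption.

def interleave(subject: str, story: str) -> str:
    """Prefix every real line of story text with a subject line,
    producing the double-spaced corpus described in note 3."""
    out = []
    for line in story.splitlines():
        out.append(f"| {subject}")  # conditioning line
        out.append(line)            # real story line
    return "\n".join(out)

print(interleave(
    "The Clockwork Orchard",
    "In the valley of gears the trees bear brass fruit.\n"
    "Travellers say the orchard winds itself at dusk.",
))
# | The Clockwork Orchard
# In the valley of gears the trees bear brass fruit.
# | The Clockwork Orchard
# Travellers say the orchard winds itself at dusk.
```

One plausible reason to repeat the subject before every line, rather than once per document, is that any fixed-length training window sliced from the corpus then still contains the conditioning signal.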