
The Book of Endless History: Authorial Use of GPT2 for Interactive Storytelling

  • Conference paper

Interactive Storytelling (ICIDS 2019)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11869)


Abstract

We present The Book of Endless History, an infinite Wikipedia of fantasy stories written in the style of Borges and Calvino, exploring the use of structural conditioning on GPT2 to generate text with explicit subjects and embedded web links. Users are presented with a Wikipedia-like interface showing a short fantasy description of a topic, with embedded links to related subjects. Clicking a link leads to another generated page, so readers may follow an endless trail of topics. GPT2 is a text-completion architecture: it has no explicit understanding of structure, and integrating it with authorial intent can be a challenge. Nevertheless, through this work we show that it can be conditioned to write about a given subject and, further, to generate the topology of an encyclopedia. We refer to this technique as subject conditioning or, more generally, structural conditioning.
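
The paper's implementation is not reproduced on this page; the snippet below is only a minimal sketch of what subject conditioning could look like in practice. It assumes the Hugging Face transformers GPT2 interface, and the "@subject:" marker, the [[double-bracket]] link convention, and the model path are hypothetical stand-ins rather than the authors' actual choices.

    # Minimal sketch of subject conditioning. Assumptions: the Hugging Face
    # "transformers" GPT2 API; the "@subject:" marker and [[link]] convention
    # are hypothetical placeholders, not taken from the paper.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    MODEL_PATH = "gpt2"  # stand-in for a model fine-tuned on subject-tagged text

    tokenizer = GPT2Tokenizer.from_pretrained(MODEL_PATH)
    model = GPT2LMHeadModel.from_pretrained(MODEL_PATH)

    def generate_entry(subject, max_new_tokens=120):
        # Prepending an explicit subject line conditions the completion
        # to stay on topic.
        prompt = f"@subject: {subject}\n"
        inputs = tokenizer(prompt, return_tensors="pt")
        output = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            do_sample=True,
            top_k=40,
            pad_token_id=tokenizer.eos_token_id,
        )
        return tokenizer.decode(output[0], skip_special_tokens=True)

    print(generate_entry("The Glass Archive of Irem"))

In a system like the one described, any [[bracketed subject]] appearing in a generated entry could be rendered as a hyperlink whose target page is generated on demand with the same subject prompt, yielding the endless trail of pages the abstract describes.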


Notes

  1. As of the writing of this paper, GPT2 is publicly available in two sizes, 117M and 345M parameters, both smaller counterparts of an unreleased 1558M model trained by OpenAI. The 345M model, while slower and larger, is much more robust than the 117M.

  2. Training took approximately 4–6 hours on an NVIDIA T4 GPU, provided free of charge by Google Colab.

  3. As you may notice, this interleaving leaves our training text double-spaced, with subject annotations between the real lines (see the sketch after these notes). GPT2 nevertheless learns to ignore this structure remarkably quickly.

  4. http://bookofendlesshistory.com.
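
As an illustration of the interleaved format mentioned in note 3, a helper like the following would produce the double-spaced layout. This is our reconstruction, not the paper's code, and the "@subject:" marker is an assumed placeholder.

    # Hypothetical sketch of the interleaved format described in note 3.
    # The "@subject:" marker is an assumed placeholder, not the paper's token.
    def interleave(subject, prose_lines):
        # Insert a subject annotation after every real line of prose; this
        # alternation is what makes the training corpus look double-spaced.
        out = []
        for line in prose_lines:
            out.append(line)
            out.append(f"@subject: {subject}")
        return "\n".join(out)

    print(interleave("The Glass Archive", [
        "The Glass Archive stands at the edge of the [[Sea of Names]].",
        "Its librarians copy each wave before it breaks.",
    ]))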


Author information

Correspondence to John Austin.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Austin, J. (2019). The Book of Endless History: Authorial Use of GPT2 for Interactive Storytelling. In: Cardona-Rivera, R., Sullivan, A., Young, R. (eds) Interactive Storytelling. ICIDS 2019. Lecture Notes in Computer Science, vol 11869. Springer, Cham. https://doi.org/10.1007/978-3-030-33894-7_47


  • DOI: https://doi.org/10.1007/978-3-030-33894-7_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-33893-0

  • Online ISBN: 978-3-030-33894-7

  • eBook Packages: Computer Science (R0)
