ABSTRACT
Procedural Content Generation (PCG) algorithms enable the automated creation of complex and diverse game environments. However, generating content that is both meaningful and reflective of a designer's specific intentions and constraints remains a challenge. Recent advances in Large Language Models (LLMs) have demonstrated their effectiveness across many domains; because these models are pre-trained on large corpora, they can be fine-tuned so that learned representations are reused to accelerate training on new tasks. Our study presents MarioGPT, a fine-tuned GPT2 model trained to generate tile-based game levels for Super Mario Bros. The results demonstrate that MarioGPT can generate diverse levels and can be text-prompted for controllable level generation, addressing a key open challenge in current PCG techniques.
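To make the "tile-based game levels as text" idea concrete, the sketch below shows one common way such levels are serialized for an autoregressive language model: each tile is a character, and the grid is read column by column, bottom to top, into a single string. This is a toy illustration (the tile alphabet and the `level_to_string`/`string_to_level` helpers are our own, not the authors' code), but it mirrors the string encoding used in LSTM- and GPT-based Mario level generation.

```python
def level_to_string(rows):
    """Flatten a level (list of equal-length row strings, top row first)
    into one string read column-by-column, bottom-to-top."""
    height, width = len(rows), len(rows[0])
    cols = []
    for x in range(width):
        # Walk each column from the ground up, matching the order in
        # which a platformer level is naturally "played" left to right.
        cols.append("".join(rows[height - 1 - y][x] for y in range(height)))
    return "".join(cols)

def string_to_level(s, height):
    """Invert level_to_string, given the level height."""
    cols = [s[i:i + height] for i in range(0, len(s), height)]
    return ["".join(col[height - 1 - y] for col in cols)
            for y in range(height)]

# A 3x4 toy level: '-' = empty sky, '?' = question block, 'X' = ground.
level = [
    "----",
    "--?-",
    "XXXX",
]
encoded = level_to_string(level)          # "X--X--X?-X--"
assert string_to_level(encoded, height=3) == level
```

A language model trained on such strings then generates new levels token by token, and decoding the output with `string_to_level` recovers a playable tile grid.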
Index Terms
- Prompt-Guided Level Generation