Learning Controllable Content Generators
- URL: http://arxiv.org/abs/2105.02993v1
- Date: Thu, 6 May 2021 22:15:51 GMT
- Title: Learning Controllable Content Generators
- Authors: Sam Earle, Maria Edwards, Ahmed Khalifa, Philip Bontrager and Julian Togelius
- Abstract summary: We train generators capable of producing controllably diverse output, by making them "goal-aware."
We show that the resulting level generators are capable of exploring the space of possible levels in a targeted, controllable manner.
- Score: 5.5805433423452895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: It has recently been shown that reinforcement learning can be used to train
generators capable of producing high-quality game levels, with quality defined
in terms of some user-specified heuristic. To ensure that these generators'
output is sufficiently diverse (that is, not amounting to the reproduction of a
single optimal level configuration), the generation process is constrained such
that the initial seed results in some variance in the generator's output.
However, this results in a loss of control over the generated content for the
human user. We propose to train generators capable of producing controllably
diverse output, by making them "goal-aware." To this end, we add conditional
inputs representing how close a generator is to some heuristic, and also modify
the reward mechanism to incorporate that value. Testing on multiple domains, we
show that the resulting level generators are capable of exploring the space of
possible levels in a targeted, controllable manner, producing levels of
comparable quality to their goal-unaware counterparts, while remaining diverse
along designer-specified dimensions.
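The goal-aware conditioning described above can be sketched concretely. The paper trains reinforcement-learning generators (a PCGRL-style setup); the hill climber, the toy heuristic, and all names below are illustrative assumptions, but they show the core idea: condition generation on a target heuristic value and reward proximity to that target, so the same generator can be steered to different outputs.

```python
import random

def heuristic(level):
    """Toy heuristic standing in for a designer metric (e.g., path length):
    here, simply the number of filled tiles in a binary 'level'."""
    return sum(level)

def goal_aware_reward(level, target):
    """Reward is highest when the level's heuristic hits the target value."""
    return -abs(heuristic(level) - target)

def generate(target, length=16, steps=500, seed=0):
    """Greedy stand-in for the RL generator: flip one tile at a time,
    keeping a flip only if the goal-aware reward does not decrease."""
    rng = random.Random(seed)
    level = [rng.randint(0, 1) for _ in range(length)]
    for _ in range(steps):
        i = rng.randrange(length)
        candidate = level[:]
        candidate[i] ^= 1  # flip one tile
        if goal_aware_reward(candidate, target) >= goal_aware_reward(level, target):
            level = candidate
    return level

# The same generator, steered to different targets, produces controllably
# diverse levels along the designer-specified dimension.
for target in (4, 8, 12):
    level = generate(target)
    print(target, heuristic(level))
```

Because the target is an input rather than a fixed objective, varying it at generation time explores the level space along that dimension without retraining.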
Related papers
- What Comes Next? Evaluating Uncertainty in Neural Text Generators Against Human Production Variability [28.403105682913374]
We characterise the extent to which human production varies lexically, syntactically, and semantically across four Natural Language Generation (NLG) tasks.
We then inspect the space of output strings shaped by a generation system's predicted probability distribution and decoding algorithm to probe its uncertainty.
We analyse NLG models and decoding strategies, demonstrating that probing a generator with multiple samples provides the level of detail necessary to gain understanding of a model's representation of uncertainty.
arXiv Detail & Related papers (2023-05-19T14:41:55Z)
- Contrastive Learning for Diverse Disentangled Foreground Generation [67.81298739373766]
We introduce a new method for diverse foreground generation with explicit control over various factors.
We leverage contrastive learning with latent codes to generate diverse foreground results for the same masked input.
Experiments demonstrate the superiority of our method over state-of-the-art approaches in result diversity and generation controllability.
arXiv Detail & Related papers (2022-11-04T18:51:04Z)
- Learning Probabilistic Models from Generator Latent Spaces with Hat EBM [81.35199221254763]
This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM).
Experiments show strong performance of the proposed method on (1) unconditional ImageNet synthesis at 128x128 resolution, (2) refining the output of existing generators, and (3) learning EBMs that incorporate non-probabilistic generators.
arXiv Detail & Related papers (2022-10-29T03:55:34Z)
- Start Small: Training Game Level Generators from Nothing by Learning at Multiple Sizes [0.0]
A procedural level generator is a tool that generates levels from noise.
One approach to building generators is machine learning, but given the rarity of training data, several methods have been proposed to train generators from nothing.
This paper proposes a novel approach to train generators from nothing by learning at multiple level sizes starting from a small size up to the desired sizes.
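The start-small-then-grow idea can be loosely illustrated. The paper trains neural generators; the hill-climbing optimizer, toy quality metric, and tile-repetition upscaling below are simplifying assumptions, meant only to show the shape of a curriculum that optimizes at a small size, upscales, and continues at each larger size.

```python
import random

def quality(level):
    """Toy quality metric: count adjacent tile pairs that differ."""
    return sum(level[i] != level[i - 1] for i in range(1, len(level)))

def improve(level, steps, rng):
    """Hill-climb stand-in for training: flip single tiles,
    keeping flips that do not reduce quality."""
    for _ in range(steps):
        i = rng.randrange(len(level))
        candidate = level[:]
        candidate[i] ^= 1
        if quality(candidate) >= quality(level):
            level = candidate
    return level

def grow_and_train(sizes=(4, 8, 16), steps=200, seed=0):
    """Start small: optimize at the smallest size, then upscale by
    repeating tiles and continue optimizing at each larger size."""
    rng = random.Random(seed)
    level = [rng.randint(0, 1) for _ in range(sizes[0])]
    level = improve(level, steps, rng)
    for size in sizes[1:]:
        # Nearest-neighbor upscale: each tile is repeated to fill the new size.
        level = [tile for tile in level for _ in range(size // len(level))]
        level = improve(level, steps, rng)
    return level

final = grow_and_train()
print(len(final), quality(final))
```

The payoff of the curriculum is that early optimization happens in a much smaller search space, and each upscale seeds the larger size with already-plausible structure instead of fresh noise.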
arXiv Detail & Related papers (2022-09-29T18:52:54Z)
- Joint Generator-Ranker Learning for Natural Language Generation [99.16268050116717]
JGR is a novel joint training algorithm that integrates the generator and the ranker in a single framework.
By iteratively updating the generator and the ranker, JGR can effectively harmonize their learning and enhance their quality jointly.
arXiv Detail & Related papers (2022-06-28T12:58:30Z)
- Collaging Class-specific GANs for Semantic Image Synthesis [68.87294033259417]
We propose a new approach for high resolution semantic image synthesis.
It consists of one base image generator and multiple class-specific generators.
Experiments show that our approach can generate high-quality images at high resolution.
arXiv Detail & Related papers (2021-10-08T17:46:56Z)
- Illuminating Diverse Neural Cellular Automata for Level Generation [5.294599496581041]
We present a method of generating a collection of neural cellular automata (NCA) to design video game levels.
Our approach can train diverse level generators, whose output levels vary based on aesthetic or functional criteria.
We apply our new method to generate level generators for several 2D tile-based games: a maze game, Sokoban, and Zelda.
arXiv Detail & Related papers (2021-09-12T11:17:31Z)
- Slimmable Generative Adversarial Networks [54.61774365777226]
Generative adversarial networks (GANs) have achieved remarkable progress in recent years, but the continuously growing scale of models makes them challenging to deploy widely in practical applications.
In this paper, we introduce slimmable GANs, which can flexibly switch the width of the generator to accommodate various quality-efficiency trade-offs at runtime.
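The width-switching mechanism can be sketched in miniature. Real slimmable GANs slice convolutional channels and pair each width with its own normalization statistics; the single fully connected layer below is a minimal illustrative sketch of the shared-weight slicing alone, with all names being assumptions.

```python
import random

class SlimmableDense:
    """Dense layer whose output width can be switched at runtime by using
    only a leading slice of one shared weight matrix."""

    def __init__(self, in_dim, max_out, seed=0):
        rng = random.Random(seed)
        self.w = [[rng.gauss(0.0, 1.0) for _ in range(max_out)]
                  for _ in range(in_dim)]
        self.b = [0.0] * max_out

    def forward(self, x, width):
        # Narrow widths reuse the first `width` output units of the full
        # weights, trading quality for efficiency without separate models.
        return [sum(xi * row[j] for xi, row in zip(x, self.w)) + self.b[j]
                for j in range(width)]

layer = SlimmableDense(in_dim=8, max_out=64)
x = [1.0] * 8
print(len(layer.forward(x, width=16)), len(layer.forward(x, width=64)))
```

Note that the narrow output is exactly a prefix of the wide output, which is what lets one set of weights serve every quality-efficiency operating point.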
arXiv Detail & Related papers (2020-12-10T13:35:22Z)
- Unsupervised Controllable Generation with Self-Training [90.04287577605723]
Controllable generation with GANs remains a challenging research problem.
We propose an unsupervised framework to learn a distribution of latent codes that control the generator through self-training.
Our framework exhibits better disentanglement compared to other variants such as the variational autoencoder.
arXiv Detail & Related papers (2020-07-17T21:50:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.