Generative Adversarial Neural Cellular Automata
- URL: http://arxiv.org/abs/2108.04328v1
- Date: Mon, 19 Jul 2021 06:23:11 GMT
- Title: Generative Adversarial Neural Cellular Automata
- Authors: Maximilian Otte, Quentin Delfosse, Johannes Czech, Kristian Kersting
- Abstract summary: We introduce a concept that feeds different initial environments as input to a single Neural Cellular Automaton in order to produce several outputs.
We also introduce GANCA, a novel algorithm that combines Neural Cellular Automata with Generative Adversarial Networks.
- Score: 13.850929935840659
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Motivated by the interaction between cells, the recently introduced concept
of Neural Cellular Automata shows promising results in a variety of tasks. So
far, this concept was mostly used to generate images for a single scenario. As
each scenario requires a new model, this type of generation seems contradictory
to the adaptability of cells in nature. To address this contradiction, we
introduce a concept using different initial environments as input while using a
single Neural Cellular Automaton to produce several outputs. Additionally, we
introduce GANCA, a novel algorithm that combines Neural Cellular Automata with
Generative Adversarial Networks, allowing for more generalization through
adversarial training. The experiments show that a single model is capable of
learning several images when presented with different inputs, and that the
adversarially trained model improves drastically on out-of-distribution data
compared to a model trained with supervised learning.
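The abstract's two ideas can be made concrete with a short sketch: a single NCA update rule shared across different initial seed environments, and adversarial training of that NCA against a discriminator (the GANCA setup). The following is a minimal, hedged PyTorch sketch; the channel count, layer sizes, step count, and training-loop details are illustrative assumptions rather than the paper's actual configuration.

```python
# Minimal GANCA-style sketch (illustrative assumptions: 16-channel grid,
# depthwise 3x3 perception, alternating generator/discriminator updates).
import torch
import torch.nn as nn

class NCA(nn.Module):
    """One shared update rule applied to every cell; only the initial grid
    (the "seed environment") differs between target outputs."""
    def __init__(self, channels: int = 16, hidden: int = 128):
        super().__init__()
        self.perceive = nn.Conv2d(channels, channels * 3, 3, padding=1,
                                  groups=channels, bias=False)  # local neighborhood features
        self.update = nn.Sequential(
            nn.Conv2d(channels * 3, hidden, 1), nn.ReLU(),
            nn.Conv2d(hidden, channels, 1),
        )

    def forward(self, grid: torch.Tensor, steps: int = 32) -> torch.Tensor:
        for _ in range(steps):
            grid = grid + self.update(self.perceive(grid))  # residual cell update
        return grid[:, :3]  # read the first 3 channels out as an RGB image

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        return self.net(img)

def ganca_step(nca, disc, opt_g, opt_d, seeds, real_images):
    """One adversarial update; `seeds` holds a different initial environment per target."""
    bce = nn.BCEWithLogitsLoss()
    # Discriminator update: real images vs. detached NCA outputs.
    fake = nca(seeds).detach()
    d_loss = (bce(disc(real_images), torch.ones(len(real_images), 1))
              + bce(disc(fake), torch.zeros(len(fake), 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator (NCA) update: try to fool the discriminator.
    fake = nca(seeds)
    g_loss = bce(disc(fake), torch.ones(len(fake), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

In this framing, "several outputs from a single model" corresponds to feeding different seed grids through the same shared update rule, and the adversarial loss replaces or complements a per-pixel supervised loss; the optimizers would be standard ones, e.g. Adam over nca.parameters() and disc.parameters() respectively.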
Related papers
- This Probably Looks Exactly Like That: An Invertible Prototypical Network [8.957872207471311]
Prototypical neural networks represent an exciting way forward in realizing human-comprehensible machine learning without concept annotations.
We find that reliance on indirect interpretation functions for prototypical explanations imposes a severe limit on prototypes' informative power.
We propose one such model, called ProtoFlow, which composes a normalizing flow with Gaussian mixture models.
arXiv Detail & Related papers (2024-07-16T21:51:02Z)
- Latent Neural Cellular Automata for Resource-Efficient Image Restoration [4.470499157873342]
We introduce the Latent Neural Cellular Automata (LNCA) model, a novel architecture designed to address the resource limitations of neural cellular automata.
Our approach shifts the computation from the conventional input space to a specially designed latent space, relying on a pre-trained autoencoder.
This modification not only reduces the model's resource consumption but also maintains a flexible framework suitable for various applications (a brief sketch of this latent-space pipeline appears after this list).
arXiv Detail & Related papers (2024-03-22T14:15:28Z)
- Locally adaptive cellular automata for goal-oriented self-organization [14.059479351946386]
We propose a new model class of adaptive cellular automata that allows for the generation of scalable and expressive models.
We show how to implement adaptation by coupling the update rule of the cellular automaton with itself and the system state in a localized way.
arXiv Detail & Related papers (2023-06-12T12:32:23Z)
- Growing Isotropic Neural Cellular Automata [63.91346650159648]
We argue that the original Growing NCA model has an important limitation: anisotropy of the learned update rule.
We demonstrate that cell systems can be trained to grow accurate asymmetrical patterns through either of two methods.
arXiv Detail & Related papers (2022-05-03T11:34:22Z)
- Meta Internal Learning [88.68276505511922]
Internal learning for single-image generation is a framework in which a generator is trained to produce novel images based on a single image.
We propose a meta-learning approach that enables training over a collection of images, in order to model the internal statistics of the sample image more effectively.
Our results show that the models obtained are as suitable as single-image GANs for many common image applications.
arXiv Detail & Related papers (2021-10-06T16:27:38Z)
- Towards self-organized control: Using neural cellular automata to robustly control a cart-pole agent [62.997667081978825]
We use neural cellular automata to control a cart-pole agent.
We trained the model using deep Q-learning, where the states of the output cells were used as the Q-value estimates to be optimized.
arXiv Detail & Related papers (2021-06-29T10:49:42Z)
- Image Generation With Neural Cellular Automatas [1.8275108630751844]
We propose a novel approach to generating images (or other artworks) using neural cellular automata (NCA).
Rather than training NCAs on single images one by one, we combine the idea with variational autoencoders (VAEs) and explore applications such as image restoration and style fusion.
arXiv Detail & Related papers (2020-10-10T08:52:52Z)
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCA, each of them capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
- Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
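Referring back to the Latent Neural Cellular Automata entry above, the sketch below illustrates the general encode / iterate-in-latent-space / decode pattern that entry describes. The autoencoder, channel sizes, and step count are placeholder assumptions; this is not the LNCA architecture itself.

```python
# Latent-space NCA sketch (illustrative; the autoencoder stands in for a
# pre-trained, frozen model that would be loaded from a checkpoint in practice).
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self, latent_channels: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, latent_channels, 4, stride=2, padding=1),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent_channels, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

class LatentNCA(nn.Module):
    """Cell updates run on the downsampled latent grid rather than in pixel
    space, which is where the resource savings come from."""
    def __init__(self, channels: int = 16, hidden: int = 64):
        super().__init__()
        self.rule = nn.Sequential(
            nn.Conv2d(channels, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, channels, 1),
        )

    def forward(self, z: torch.Tensor, steps: int = 16) -> torch.Tensor:
        for _ in range(steps):
            z = z + self.rule(z)  # residual latent-cell update
        return z

def restore(ae: ConvAutoencoder, nca: LatentNCA, degraded: torch.Tensor) -> torch.Tensor:
    """Encode a degraded image, iterate the NCA in latent space, then decode."""
    with torch.no_grad():  # the encoder is treated as fixed (pre-trained)
        z = ae.encoder(degraded)
    z = nca(z)
    return ae.decoder(z)
```

Because every NCA step here operates on a grid downsampled by a factor of 4 in each spatial dimension, the per-step cost drops accordingly, which is the resource argument that entry makes.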
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.