Growing Steerable Neural Cellular Automata
- URL: http://arxiv.org/abs/2302.10197v2
- Date: Wed, 17 May 2023 15:34:32 GMT
- Title: Growing Steerable Neural Cellular Automata
- Authors: Ettore Randazzo, Alexander Mordvintsev and Craig Fouts
- Abstract summary: In the original implementation of Neural Cellular Automata, cells are incapable of adjusting their own orientation.
We make each cell responsible for its own orientation by allowing it to "turn" as determined by an adjustable internal state.
We show that we can train Steerable NCA in ways similar to, but simpler than, their Isotropic variant by: (1) breaking symmetries using only two seeds, or (2) introducing a rotation-invariant training objective.
- Score: 63.91346650159648
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Cellular Automata (NCA) models have shown remarkable capacity for
pattern formation and complex global behaviors stemming from local
coordination. However, in the original implementation of NCA, cells are
incapable of adjusting their own orientation, and it is the responsibility of
the model designer to orient them externally. A recent isotropic variant of NCA
(Growing Isotropic Neural Cellular Automata) makes the model
orientation-independent - cells can no longer tell up from down, nor left from
right - by removing its dependency on perceiving the gradient of spatial states
in its neighborhood. In this work, we revisit NCA with a different approach: we
make each cell responsible for its own orientation by allowing it to "turn" as
determined by an adjustable internal state. The resulting Steerable NCA
contains cells of varying orientation embedded in the same pattern. We observe
how, while Isotropic NCA are orientation-agnostic, Steerable NCA have
chirality: they have a predetermined left-right axis. We therefore show
that we can train Steerable NCA in ways similar to, but simpler than, their
Isotropic variant by: (1) breaking symmetries using only two seeds, or (2)
introducing a rotation-invariant training objective and relying on asynchronous
cell updates to break the up-down symmetry of the system.
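The core mechanism described above (each cell perceiving its neighborhood in its own rotated frame) can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the function names, the single-angle-channel layout, and the Sobel-based perception are assumptions made for clarity.

```python
import numpy as np

# Sobel filters for estimating spatial gradients of the state grid.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32) / 8.0
SOBEL_Y = SOBEL_X.T

def conv2d(channel, kernel):
    """3x3 'same' cross-correlation with zero padding (plain NumPy)."""
    padded = np.pad(channel, 1)
    h, w = channel.shape
    out = np.zeros_like(channel)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

def steerable_perceive(state, angle):
    """Sketch of a Steerable NCA perception step.

    state: (H, W, C) cell states; angle: (H, W) per-cell orientation,
    read from an adjustable internal state channel. Returns an
    (H, W, 3*C) perception vector: the identity channels plus the two
    gradient maps rotated into each cell's local frame, so the update
    rule sees the neighborhood from the cell's own point of view.
    """
    gx = np.stack([conv2d(state[..., c], SOBEL_X) for c in range(state.shape[-1])], -1)
    gy = np.stack([conv2d(state[..., c], SOBEL_Y) for c in range(state.shape[-1])], -1)
    c, s = np.cos(angle)[..., None], np.sin(angle)[..., None]
    # Rotate the gradient vector (gx, gy) by -angle at every cell.
    gx_local = c * gx + s * gy
    gy_local = -s * gx + c * gy
    return np.concatenate([state, gx_local, gy_local], axis=-1)
```

Because the rotation only mixes the two gradient maps, a cell with angle 0 perceives the standard (global-frame) gradients, while a cell turned by π sees them sign-flipped; the left-right chirality noted in the abstract arises because a 2D rotation can never mirror the frame.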
Related papers
- NoiseNCA: Noisy Seed Improves Spatio-Temporal Continuity of Neural Cellular Automata [23.73063532045145]
NCA is a class of Cellular Automata where the update rule is parameterized by a neural network.
We show that existing NCA models tend to overfit the training discretization.
We propose a solution that utilizes uniform noise as the initial condition.
arXiv Detail & Related papers (2024-04-09T13:02:33Z)
- RigLSTM: Recurrent Independent Grid LSTM for Generalizable Sequence Learning [75.61681328968714]
We propose recurrent independent Grid LSTM (RigLSTM) to exploit the underlying modular structure of the target task.
Our model adopts cell selection, input feature selection, hidden state selection, and soft state updating to achieve a better generalization ability.
arXiv Detail & Related papers (2023-11-03T07:40:06Z)
- Learning spatio-temporal patterns with Neural Cellular Automata [0.0]
We train NCA to learn complex dynamics from time series of images and PDE trajectories.
We extend NCA to capture both transient and stable structures within the same system.
Being able to learn arbitrary dynamics gives NCA great potential as a data driven modelling framework.
arXiv Detail & Related papers (2023-10-23T11:16:32Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Neural Cellular Automata Can Respond to Signals [0.0]
We show that NCAs can be trained to respond to signals.
Two types of signal are used: internal (genomically-coded) signals, and external (environmental) signals.
Results show NCAs are able to grow into multiple distinct forms based on internal signals, and are able to change colour based on external signals.
arXiv Detail & Related papers (2023-05-22T12:26:46Z)
- E(n)-equivariant Graph Neural Cellular Automata [4.168157981135698]
We propose a class of isotropic automata that we call E(n)-GNCAs.
These models are lightweight, but can nevertheless handle large graphs, capture complex dynamics and exhibit emergent self-organising behaviours.
We showcase the broad and successful applicability of E(n)-GNCAs on three different tasks.
arXiv Detail & Related papers (2023-01-25T10:17:07Z)
- Growing Isotropic Neural Cellular Automata [63.91346650159648]
We argue that the original Growing NCA model has an important limitation: anisotropy of the learned update rule.
We demonstrate that cell systems can be trained to grow accurate asymmetrical patterns through either of two methods.
arXiv Detail & Related papers (2022-05-03T11:34:22Z)
- Emergence of Lie symmetries in functional architectures learned by CNNs [63.69764116066748]
We study the spontaneous development of symmetries in the early layers of a Convolutional Neural Network (CNN) during learning on natural images.
Our architecture is built in such a way to mimic the early stages of biological visual systems.
arXiv Detail & Related papers (2021-04-17T13:23:26Z)
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCA, each of them capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.