Exploring Multiple Neighborhood Neural Cellular Automata (MNNCA) for
Enhanced Texture Learning
- URL: http://arxiv.org/abs/2311.16123v1
- Date: Fri, 27 Oct 2023 15:16:19 GMT
- Title: Exploring Multiple Neighborhood Neural Cellular Automata (MNNCA) for
Enhanced Texture Learning
- Authors: Magnus Petersen
- Abstract summary: Cellular Automata (CA) have long been foundational in simulating dynamical systems.
Recent innovations have brought Neural Cellular Automata (NCA) into the realm of deep learning.
This parameterization allows NCAs to be trained via gradient descent, enabling them to evolve into specific shapes, generate textures, and mimic behaviors such as swarming.
Our research explores enhancing the NCA framework by incorporating multiple neighborhoods and introducing structured noise for seed states.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cellular Automata (CA) have long been foundational in simulating dynamical
systems computationally. With recent innovations, this model class has been
brought into the realm of deep learning by parameterizing the CA's update rule
using an artificial neural network, termed Neural Cellular Automata (NCA). This
allows NCAs to be trained via gradient descent, enabling them to evolve into
specific shapes, generate textures, and mimic behaviors such as swarming.
However, a limitation of traditional NCAs is their inability to exhibit
sufficiently complex behaviors, restricting their potential in creative and
modeling tasks. Our research explores enhancing the NCA framework by
incorporating multiple neighborhoods and introducing structured noise for seed
states. This approach is inspired by techniques that have historically
amplified the expressiveness of classical continuous CA. All code and example
videos are publicly available on https://github.com/MagnusPetersen/MNNCA.
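To make the two proposed enhancements concrete, below is a minimal PyTorch sketch: multiple neighborhoods are modeled as depthwise "perception" convolutions at different dilations feeding a shared update rule, and the structured noise seed is modeled as coarse Gaussian noise upsampled to the grid. All names and sizes here are illustrative assumptions; the actual implementation in the repository linked above may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiNeighborhoodNCA(nn.Module):
    """Toy multi-neighborhood NCA: each neighborhood is a depthwise 3x3
    convolution at a different dilation, so a cell perceives its
    surroundings at several spatial scales simultaneously."""

    def __init__(self, channels=16, dilations=(1, 2, 4)):
        super().__init__()
        self.perceive = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=d, dilation=d,
                      groups=channels, bias=False)
            for d in dilations
        ])
        # A shared update rule maps the concatenated perceptions back
        # to a per-cell state change.
        self.update = nn.Sequential(
            nn.Conv2d(channels * len(dilations), 128, 1), nn.ReLU(),
            nn.Conv2d(128, channels, 1),
        )

    def forward(self, state, steps=1):
        for _ in range(steps):
            perception = torch.cat([p(state) for p in self.perceive], dim=1)
            # Random per-cell update mask, as in standard NCA training.
            mask = (torch.rand_like(state[:, :1]) < 0.5).float()
            state = state + mask * self.update(perception)
        return state

def structured_noise_seed(batch, channels, size, scale=8):
    """Structured (low-frequency) noise seed: coarse Gaussian noise
    upsampled to the grid resolution."""
    coarse = torch.randn(batch, channels, size // scale, size // scale)
    return F.interpolate(coarse, size=(size, size), mode="bilinear",
                         align_corners=False)

seed = structured_noise_seed(1, 16, 64)
texture = MultiNeighborhoodNCA()(seed, steps=32)
```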
Related papers
- Convolutional Neural Networks for Automated Cellular Automaton Classification [0.0]
We implement computer vision techniques to perform an automated classification of elementary cellular automata into the five Li-Packard classes.
We first show that previously developed deep learning approaches have in fact been trained to identify the local update rule.
We then present a convolutional neural network that performs nearly perfectly at identifying the behavioural class.
arXiv Detail & Related papers (2024-09-04T14:21:00Z)
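As a rough illustration of the classification pipeline in the entry above, the sketch below rolls an elementary CA rule out into a spacetime diagram and feeds it to a small CNN with five outputs, one per Li-Packard behavioural class. The architecture and data handling are assumptions for illustration, not the paper's setup.

```python
import numpy as np
import torch
import torch.nn as nn

def eca_spacetime(rule, width=64, steps=64, seed=0):
    """Roll out elementary CA `rule` from a random row; returns a
    (steps, width) binary spacetime diagram."""
    rng = np.random.default_rng(seed)
    table = np.array([(rule >> i) & 1 for i in range(8)])
    row = rng.integers(0, 2, width)
    rows = [row]
    for _ in range(steps - 1):
        # Neighborhood code: left * 4 + center * 2 + right.
        idx = 4 * np.roll(row, 1) + 2 * row + np.roll(row, -1)
        row = table[idx]
        rows.append(row)
    return np.stack(rows)

# Small CNN with five outputs, one per Li-Packard class
# (illustrative architecture, not the one from the paper).
classifier = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(32, 5),
)

diagram = torch.tensor(eca_spacetime(110), dtype=torch.float32)
logits = classifier(diagram[None, None])  # shape (1, 5)
```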
- Multi-Texture Synthesis through Signal Responsive Neural Cellular Automata [44.99833362998488]
We train a single NCA for the evolution of multiple textures, based on individual examples.
Our solution provides texture information in the state of each cell, in the form of an internally coded genomic signal, which enables the NCA to generate the expected texture.
arXiv Detail & Related papers (2024-07-08T14:36:20Z)
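A minimal sketch of the "genomic signal" idea from the entry above, assuming a simple one-hot code appended to the cell state so a single NCA can be conditioned on which texture to grow (the paper's internal coding may differ):

```python
import torch

def seed_with_genome(batch, state_ch, genome_ch, size, texture_id):
    """Seed for a single multi-texture NCA: ordinary state channels plus
    constant 'genome' channels telling every cell which texture to
    produce (one-hot here; the paper uses an internally coded signal)."""
    state = torch.zeros(batch, state_ch + genome_ch, size, size)
    state[:, state_ch + texture_id] = 1.0  # same code at every grid cell
    return state

# One model, texture 3 of 8:
seed = seed_with_genome(1, state_ch=12, genome_ch=8, size=64, texture_id=3)
```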
- Harnessing Neural Unit Dynamics for Effective and Scalable Class-Incremental Learning [38.09011520275557]
Class-incremental learning (CIL) aims to train a model to learn new classes from non-stationary data streams without forgetting old ones.
We propose a new kind of connectionist model by tailoring neural unit dynamics that adapt the behavior of neural networks for CIL.
arXiv Detail & Related papers (2024-06-04T15:47:03Z)
- NoiseNCA: Noisy Seed Improves Spatio-Temporal Continuity of Neural Cellular Automata [23.73063532045145]
NCA is a class of Cellular Automata where the update rule is parameterized by a neural network.
We show that existing NCA models tend to overfit the training discretization.
We propose a solution that utilizes uniform noise as the initial condition.
arXiv Detail & Related papers (2024-04-09T13:02:33Z)
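The mechanism in the NoiseNCA entry above reduces to swapping the usual constant seed for a noise seed; a one-function sketch, assuming i.i.d. uniform noise per cell and channel:

```python
import torch

def noise_seed(batch, channels, size):
    """NoiseNCA-style initial condition: i.i.d. U(0, 1) noise per cell
    and channel, replacing the conventional constant seed
    (e.g. torch.zeros(batch, channels, size, size))."""
    return torch.rand(batch, channels, size, size)
```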
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- CQural: A Novel CNN based Hybrid Architecture for Quantum Continual Machine Learning [0.0]
We show that it is possible to circumvent catastrophic forgetting in continual learning with novel hybrid classical-quantum neural networks.
We also claim that if the model is trained with these explanations, it tends to give better performance and learn specific features that are far from the decision boundary.
arXiv Detail & Related papers (2023-05-16T18:19:12Z)
- Reducing Catastrophic Forgetting in Self Organizing Maps with Internally-Induced Generative Replay [67.50637511633212]
A lifelong learning agent is able to continually learn from potentially infinite streams of pattern sensory data.
One major historic difficulty in building agents that adapt is that neural systems struggle to retain previously-acquired knowledge when learning from new samples.
This problem is known as catastrophic forgetting (interference) and remains an unsolved problem in the domain of machine learning to this day.
arXiv Detail & Related papers (2021-12-09T07:11:14Z)
- Train your classifier first: Cascade Neural Networks Training from upper layers to lower layers [54.47911829539919]
We develop a novel top-down training method which can be viewed as an algorithm for searching for high-quality classifiers.
We tested this method on automatic speech recognition (ASR) tasks and language modelling tasks.
The proposed method consistently improves recurrent neural network ASR models on Wall Street Journal, self-attention ASR models on Switchboard, and AWD-LSTM language models on WikiText-2.
arXiv Detail & Related papers (2021-02-09T08:19:49Z)
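A hedged sketch of the top-down idea in the entry above: fit the classifier head over frozen lower layers, then progressively include deeper blocks. The staging, blocks, and optimizer choices are illustrative; the paper's search procedure is more involved.

```python
import torch
import torch.nn as nn

# Three blocks; training proceeds from the top (classifier) downwards.
blocks = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),  # lower layers
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),  # middle layers
    nn.Linear(64, 10),                            # classifier head
])

def train_stage(trainable, data, epochs=1):
    """Train only the `trainable` blocks; the rest stay frozen."""
    for b in blocks:
        b.requires_grad_(b in trainable)
    opt = torch.optim.Adam(
        [p for b in trainable for p in b.parameters()], lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data:
            for b in blocks:  # full forward pass through all blocks
                x = b(x)
            opt.zero_grad()
            loss_fn(x, y).backward()
            opt.step()

# Top-down cascade: head first, then head + middle, then everything.
data = [(torch.randn(8, 32), torch.randint(0, 10, (8,)))]
train_stage([blocks[2]], data)
train_stage([blocks[1], blocks[2]], data)
train_stage(list(blocks), data)
```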
- Incremental Training of a Recurrent Neural Network Exploiting a Multi-Scale Dynamic Memory [79.42778415729475]
We propose a novel incrementally trained recurrent architecture targeting explicitly multi-scale learning.
We show how to extend the architecture of a simple RNN by separating its hidden state into different modules.
We discuss a training algorithm where new modules are iteratively added to the model to learn progressively longer dependencies.
arXiv Detail & Related papers (2020-06-29T08:35:49Z)
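A minimal sketch of the modular hidden state described in the entry above, assuming each module is a GRU cell ticking at half the rate of its predecessor; the paper's gating and training schedule differ in detail.

```python
import torch
import torch.nn as nn

class MultiScaleRNN(nn.Module):
    """Hidden state split into modules ticking at different rates; a new,
    slower module can be appended to capture longer dependencies."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.input_size, self.hidden_size = input_size, hidden_size
        self.cells = nn.ModuleList([nn.GRUCell(input_size, hidden_size)])
        self.rates = [1]  # module i updates every rates[i] time steps

    def grow(self):
        """Incremental step: add a module at half the update frequency."""
        self.cells.append(nn.GRUCell(self.input_size, self.hidden_size))
        self.rates.append(self.rates[-1] * 2)

    def forward(self, xs):  # xs: (seq_len, batch, input_size)
        hs = [xs.new_zeros(xs.size(1), self.hidden_size)
              for _ in self.cells]
        outs = []
        for t, x in enumerate(xs):
            for i, cell in enumerate(self.cells):
                if t % self.rates[i] == 0:  # slow modules skip steps
                    hs[i] = cell(x, hs[i])
            outs.append(torch.cat(hs, dim=-1))
        return torch.stack(outs)  # (seq_len, batch, hidden * n_modules)

rnn = MultiScaleRNN(8, 32)
rnn.grow()  # now two modules: fast (every step) and slow (every 2nd)
out = rnn(torch.randn(20, 4, 8))
```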
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCA, each of them capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
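The construction in the entry above resembles a hypernetwork; here is a sketch in which a larger network maps a code vector z to the weights of a per-image NCA update rule, so moving through z-space moves through a family of automata. Names and sizes are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class NCAHyperNetwork(nn.Module):
    """A larger NN that maps a code vector z to the parameters of a
    per-image NCA update rule: z-space indexes a manifold of NCAs."""

    def __init__(self, z_dim=64, channels=16, hidden=96):
        super().__init__()
        self.c, self.h = channels, hidden
        n_params = (channels * hidden + hidden) + (hidden * channels + channels)
        self.hyper = nn.Sequential(
            nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, n_params))

    def update_rule(self, z):
        """Unpack the generated vector into a two-layer 1x1 update net."""
        p, (c, h) = self.hyper(z), (self.c, self.h)
        w1, p = p[:c * h].view(h, c), p[c * h:]
        b1, p = p[:h], p[h:]
        w2, b2 = p[:h * c].view(c, h), p[h * c:]

        def rule(state):  # state: (B, C, H, W)
            x = torch.einsum("hc,bcij->bhij", w1, state)
            x = torch.relu(x + b1[None, :, None, None])
            return torch.einsum("ch,bhij->bcij", w2, x) + b2[None, :, None, None]
        return rule

net = NCAHyperNetwork()
rule = net.update_rule(torch.randn(64))  # one point on the manifold
state = torch.randn(1, 16, 32, 32)
state = state + rule(state)              # a single NCA step
```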
- AutoML-Zero: Evolving Machine Learning Algorithms From Scratch [76.83052807776276]
We show that it is possible to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks.
We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space.
We believe these preliminary successes in discovering machine learning algorithms from scratch indicate a promising new direction in the field.
arXiv Detail & Related papers (2020-03-06T19:00:04Z)
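A toy flavour of the search described in the entry above, assuming programs are short sequences of basic ops over a register file, evolved by mutation and tournament selection on a tiny regression task; this is vastly simplified relative to the paper's framework.

```python
import random
import numpy as np

OPS = {"add": np.add, "sub": np.subtract, "mul": np.multiply}

def random_instr():
    """(op, destination register, two source registers)"""
    return (random.choice(list(OPS)), random.randrange(4),
            random.randrange(4), random.randrange(4))

def run(program, x):
    regs = [x, np.ones_like(x), np.zeros_like(x), np.zeros_like(x)]
    for op, dst, a, b in program:
        regs[dst] = OPS[op](regs[a], regs[b])
    return regs[3]  # register 3 holds the prediction

def fitness(program, x, y):
    return -np.mean((run(program, x) - y) ** 2)

x = np.linspace(-1.0, 1.0, 50)
y = 2.0 * x + 1.0                        # toy target: a linear map
pop = [[random_instr() for _ in range(5)] for _ in range(100)]
for _ in range(500):                     # mutate-and-select loop
    parent = max(random.sample(pop, 10), key=lambda p: fitness(p, x, y))
    child = list(parent)
    child[random.randrange(len(child))] = random_instr()
    pop[random.randrange(len(pop))] = child
best = max(pop, key=lambda p: fitness(p, x, y))
```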
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.