Predicting Geographic Information with Neural Cellular Automata
- URL: http://arxiv.org/abs/2009.09347v1
- Date: Sun, 20 Sep 2020 03:53:48 GMT
- Title: Predicting Geographic Information with Neural Cellular Automata
- Authors: Mingxiang Chen, Qichang Chen, Lei Gao, Yilin Chen, Zhecheng Wang
- Abstract summary: This paper presents a novel framework using neural cellular automata (NCA) to regenerate and predict geographic information.
The model extends the idea of using NCA to generate/regenerate a specific image by training the model with various geographic data.
- Score: 7.605218364952221
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a novel framework using neural cellular automata (NCA) to
regenerate and predict geographic information. The model extends the idea of
using NCA to generate/regenerate a specific image by training the model with
various geographic data; taking the traffic condition map as an example, the model can predict traffic conditions given certain induction information. Our research verifies the analogy between NCAs and genes in biology, and the innovation of the model significantly widens the range of possible NCA applications. Our experimental results show that the model offers usability and versatility not available in previous studies. The code for the model implementation is available
at https://redacted.
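For intuition, here is a minimal sketch of one NCA update step in the style of the image-regenerating NCA (Mordvintsev et al., 2020) that the abstract says this model extends. The channel count, fire rate, and the single-seed "induction" input are illustrative assumptions, not the authors' configuration.

```python
# A minimal sketch of one neural cellular automaton (NCA) update step.
import torch
import torch.nn.functional as F

CH = 16  # cell state channels (assumption; e.g. visible + hidden state)

# Fixed perception filters: identity plus Sobel gradients, per channel.
ident = torch.tensor([[0., 0., 0.], [0., 1., 0.], [0., 0., 0.]])
sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]) / 8.0
kernels = torch.stack([ident, sobel_x, sobel_x.t()])    # (3, 3, 3)
kernels = kernels.repeat(CH, 1, 1).unsqueeze(1)         # (3*CH, 1, 3, 3)

# Learned update rule: two 1x1 convolutions, the last zero-initialized.
update = torch.nn.Sequential(
    torch.nn.Conv2d(3 * CH, 128, 1), torch.nn.ReLU(),
    torch.nn.Conv2d(128, CH, 1, bias=False),
)
torch.nn.init.zeros_(update[-1].weight)

def nca_step(state, fire_rate=0.5):
    """state: (batch, CH, H, W) grid of cell state vectors."""
    perception = F.conv2d(state, kernels, padding=1, groups=CH)
    delta = update(perception)
    # Stochastic update: each cell fires independently, which makes the
    # rule robust to asynchronous updates.
    mask = (torch.rand_like(state[:, :1]) < fire_rate).float()
    return state + delta * mask

# Usage: seed the grid with induction information and iterate.
grid = torch.zeros(1, CH, 64, 64)
grid[:, :, 32, 32] = 1.0   # single seed cell (stand-in for induction input)
for _ in range(64):
    grid = nca_step(grid)
```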
Related papers
- Diffusion-Based Generation of Neural Activity from Disentangled Latent Codes [1.9544534628180867]
We propose a new approach to neural data analysis that leverages advances in conditional generative modeling.
We apply our model, called Generating Neural Observations Conditioned on Codes with High Information, to time series neural data.
In comparison to a VAE-based sequential autoencoder, GNOCCHI learns higher-quality latent spaces that are more clearly structured and more disentangled with respect to key behavioral variables.
arXiv Detail & Related papers (2024-07-30T21:07:09Z)
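The GNOCCHI entry above centers on conditional generative modeling of neural activity. As a loose illustration only, the sketch below shows a standard training step of a denoising diffusion model conditioned on a latent code; the shapes, code dimension, and noise schedule are assumptions, not GNOCCHI's architecture.

```python
# A minimal sketch of training a denoiser conditioned on a latent code.
import torch

T = 100                                  # diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

# Denoiser: predicts the noise from (noisy signal, latent code, timestep).
denoiser = torch.nn.Sequential(
    torch.nn.Linear(64 + 8 + 1, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 64),
)
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

def training_step(x0, code):
    """x0: (batch, 64) clean neural activity; code: (batch, 8) latent code."""
    t = torch.randint(0, T, (x0.shape[0],))
    a = alphas_bar[t].unsqueeze(1)
    noise = torch.randn_like(x0)
    xt = a.sqrt() * x0 + (1.0 - a).sqrt() * noise   # forward noising
    t_feat = (t.float() / T).unsqueeze(1)
    pred = denoiser(torch.cat([xt, code, t_feat], dim=1))
    loss = ((pred - noise) ** 2).mean()             # standard DDPM loss
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```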
- Functional Neural Networks: Shift invariant models for functional data with applications to EEG classification [0.0]
We introduce a new class of neural networks that are shift invariant and preserve smoothness of the data: functional neural networks (FNNs).
For this, we use methods from functional data analysis (FDA) to extend multi-layer perceptrons and convolutional neural networks to functional data.
We show that the models outperform a benchmark model from FDA in terms of accuracy and successfully use FNNs to classify electroencephalography (EEG) data.
arXiv Detail & Related papers (2023-01-14T09:41:21Z)
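To make the shift-invariance property in the FNN entry above concrete, here is a small sketch on discretized functional data: a circular 1D convolution commutes with shifts of the input function. The grid size and layer widths are illustrative assumptions.

```python
# Shift-equivariance of a convolutional layer on discretized functions.
import torch

layer = torch.nn.Conv1d(in_channels=1, out_channels=4, kernel_size=9,
                        padding=4, padding_mode="circular")

grid = torch.linspace(0, 2 * torch.pi, 128)
f = torch.sin(grid).reshape(1, 1, -1)          # one observed function
f_shift = torch.roll(f, shifts=16, dims=-1)    # shifted version

# Shifting the input shifts the features the same way.
out, out_shift = layer(f), layer(f_shift)
print(torch.allclose(torch.roll(out, 16, dims=-1), out_shift, atol=1e-5))
```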
- Classification of EEG Motor Imagery Using Deep Learning for Brain-Computer Interface Systems [79.58173794910631]
A trained T1 class Convolutional Neural Network (CNN) model is used to examine its ability to identify motor imagery.
If the model has been trained accurately, it should be able to identify a class and label it accordingly.
The CNN model is then restored and used to identify the same class of motor imagery using much smaller sampled data.
arXiv Detail & Related papers (2022-05-31T17:09:46Z)
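The save-restore-predict workflow described in the entry above can be sketched as follows; the architecture, the 22-channel EEG input, and the 4-class setup are assumptions for illustration.

```python
# Save a trained CNN's weights, restore them, and classify a small batch.
import torch

cnn = torch.nn.Sequential(
    torch.nn.Conv1d(22, 16, 5), torch.nn.ReLU(),   # 22 EEG channels (assumed)
    torch.nn.AdaptiveAvgPool1d(1), torch.nn.Flatten(),
    torch.nn.Linear(16, 4),                        # 4 motor-imagery classes
)
torch.save(cnn.state_dict(), "mi_cnn.pt")          # after training

# Later: restore the trained model and label a much smaller sample.
cnn.load_state_dict(torch.load("mi_cnn.pt"))
cnn.eval()
small_batch = torch.randn(2, 22, 250)              # 2 trials, 250 time samples
with torch.no_grad():
    labels = cnn(small_batch).argmax(dim=1)
```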
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
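As a rough sketch of the physics-informed idea in the EINNs entry above, the example below adds a mechanistic ODE residual to a data-fitting loss. The plain SIR model and its parameters are assumptions; EINNs use a richer epidemiological formulation.

```python
# Physics-informed loss: neural network prediction + mechanistic residual.
import torch

net = torch.nn.Sequential(                 # t -> (S, I, R)
    torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 3),
)
beta, gamma = 0.3, 0.1                     # assumed mechanistic parameters

def sir_residual(t):
    t = t.requires_grad_(True)
    S, I, R = net(t).unbind(dim=1)
    dS = torch.autograd.grad(S.sum(), t, create_graph=True)[0].squeeze(1)
    dI = torch.autograd.grad(I.sum(), t, create_graph=True)[0].squeeze(1)
    dR = torch.autograd.grad(R.sum(), t, create_graph=True)[0].squeeze(1)
    # SIR dynamics: dS=-beta*S*I, dI=beta*S*I-gamma*I, dR=gamma*I.
    return ((dS + beta * S * I) ** 2 +
            (dI - beta * S * I + gamma * I) ** 2 +
            (dR - gamma * I) ** 2).mean()

t_obs = torch.rand(32, 1)
i_obs = torch.rand(32)                     # observed infections (placeholder)
data_loss = ((net(t_obs)[:, 1] - i_obs) ** 2).mean()
loss = data_loss + sir_residual(torch.rand(64, 1))   # data + physics
```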
- Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters than the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z)
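A hedged sketch of the kind of pruning the Sparse Flows entry above describes: magnitude pruning applied to the vector-field network of a neural ODE. The 90% sparsity level and the tiny MLP are illustrative, and a real experiment would use an adaptive ODE solver rather than the crude Euler loop here.

```python
# Magnitude pruning of a neural ODE's vector-field network.
import torch
import torch.nn.utils.prune as prune

ode_func = torch.nn.Sequential(            # f(h) in dh/dt = f(h)
    torch.nn.Linear(8, 64), torch.nn.Tanh(), torch.nn.Linear(64, 8),
)

# Prune the smallest-magnitude 90% of weights in each linear layer.
for module in ode_func:
    if isinstance(module, torch.nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.9)

# Crude Euler integration of the pruned ODE (for illustration only).
h, dt = torch.randn(1, 8), 0.1
for _ in range(10):
    h = h + dt * ode_func(h)
```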
- The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
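The mechanism in the entry above, neurons predicting their neighbors and learning from the prediction error, can be sketched as a two-layer predictive-coding loop. The layer sizes, learning rate, and update details are assumptions, not the paper's exact model.

```python
# A minimal predictive-coding loop: a top layer predicts the bottom layer
# and weights are adjusted from the local prediction error.
import torch

torch.manual_seed(0)
W = torch.randn(10, 20) * 0.1      # top layer (10 units) predicts bottom (20)
lr = 0.05

def settle_and_learn(x, steps=20):
    """x: (20,) observed bottom-layer activity."""
    global W
    z = torch.zeros(10)             # latent top-layer activity
    for _ in range(steps):
        err = x - z @ W             # prediction error at the bottom layer
        z = z + lr * (err @ W.t())  # infer latent state from the error
    W = W + lr * torch.outer(z, err)  # Hebbian-style local weight update
    return err

residual = settle_and_learn(torch.randn(20))
```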
- A Bayesian Perspective on Training Speed and Model Selection [51.15664724311443]
We show that a measure of a model's training speed can be used to estimate its marginal likelihood.
We verify our results in model selection tasks for linear models and for the infinite-width limit of deep neural networks.
Our results suggest a promising new direction towards explaining why neural networks trained with gradient descent are biased towards functions that generalize well.
arXiv Detail & Related papers (2020-10-27T17:56:14Z)
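As a loose illustration of the entry above, the sketch below accumulates log predictive losses while fitting data points one at a time (a prequential sum), which is the kind of training-speed quantity the paper relates to the marginal likelihood. The linear-regression setup and noise variance are assumptions.

```python
# Prequential sum of log predictive losses as a training-speed estimate.
import torch

torch.manual_seed(0)
X, w_true = torch.randn(50, 3), torch.tensor([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * torch.randn(50)

w, log_ml = torch.zeros(3, requires_grad=True), 0.0
opt = torch.optim.SGD([w], lr=0.05)
for i in range(50):
    pred = X[i] @ w
    # Log predictive probability of the next point, up to a constant,
    # under an assumed Gaussian noise model with variance 0.01,
    # evaluated before updating on that point.
    log_ml += -0.5 * float((pred - y[i]) ** 2) / 0.01
    loss = (pred - y[i]) ** 2
    opt.zero_grad(); loss.backward(); opt.step()
print("training-speed estimate of log marginal likelihood:", log_ml)
```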
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCA, each of them capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
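One way to picture the encapsulation described in the entry above is a hypernetwork: a larger network maps a per-image embedding to the parameters of an NCA update rule, so one model encodes a manifold of NCAs. The sketch below is a minimal version under assumed sizes, not the paper's architecture.

```python
# A hypernetwork that emits the weights of a tiny NCA update rule.
import torch

EMB, CH, HID = 32, 16, 64
n_params = (3 * CH) * HID + HID + HID * CH     # weights of the update MLP

hyper = torch.nn.Linear(EMB, n_params)         # embedding -> NCA parameters

def make_update_rule(embedding):
    """Slice the hypernetwork output into the weights of one NCA's rule."""
    p = hyper(embedding)
    w1 = p[: 3 * CH * HID].reshape(HID, 3 * CH)
    b1 = p[3 * CH * HID: 3 * CH * HID + HID]
    w2 = p[3 * CH * HID + HID:].reshape(CH, HID)
    def rule(perception):                      # perception: (..., 3*CH)
        return torch.relu(perception @ w1.t() + b1) @ w2.t()
    return rule

rule = make_update_rule(torch.randn(EMB))      # one point on the NCA manifold
delta = rule(torch.randn(5, 3 * CH))           # per-cell state update
```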
- Learning of Discrete Graphical Models with Neural Networks [15.171938155576566]
We introduce NeurISE, a neural-net-based algorithm for graphical model learning.
NeurISE is seen to be a better alternative to GRISE when the energy function of the true model has a high order.
We also show a variant of NeurISE that can be used to learn a neural net representation for the full energy function of the true model.
arXiv Detail & Related papers (2020-06-21T23:34:01Z)
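A hedged sketch of the interaction-screening idea behind NeurISE, for an Ising-type model over +/-1 spins: a small neural network plays the role of the local energy of one variable and is trained by minimizing the interaction screening objective on samples. The sizes and the random placeholder data are assumptions.

```python
# Neural interaction screening for one variable of an Ising-type model.
import torch

torch.manual_seed(0)
n_vars, i = 8, 0
samples = torch.randint(0, 2, (500, n_vars)).float() * 2 - 1  # +/-1 spins

# Local energy H_i(x_{-i}) for variable i, as a neural network.
h_i = torch.nn.Sequential(
    torch.nn.Linear(n_vars - 1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(h_i.parameters(), lr=1e-2)

others = torch.cat([samples[:, :i], samples[:, i + 1:]], dim=1)
x_i = samples[:, i]
for _ in range(200):
    # Interaction screening objective: E[exp(-x_i * H_i(x_{-i}))].
    loss = torch.exp(-x_i * h_i(others).squeeze(1)).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```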
This list is automatically generated from the titles and abstracts of the papers on this site.