Spontaneous Emergence of Computation in Network Cascades
- URL: http://arxiv.org/abs/2204.11956v2
- Date: Wed, 27 Apr 2022 14:38:26 GMT
- Title: Spontaneous Emergence of Computation in Network Cascades
- Authors: Galen Wilkerson, Sotiris Moschoyiannis, Henrik Jeldtoft Jensen
- Abstract summary: We show that computation of complex Boolean functions arises spontaneously in threshold networks as a function of connectivity and antagonism (inhibition).
We also show that the optimal fraction of inhibition observed here supports results in computational neuroscience, relating to optimal information processing.
- Score: 0.7734726150561089
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neuronal network computation and computation by avalanche-supporting networks
are of interest to the fields of physics, computer science (computation theory
as well as statistical or machine learning) and neuroscience. Here we show that
computation of complex Boolean functions arises spontaneously in threshold
networks as a function of connectivity and antagonism (inhibition), computed by
logic automata (motifs) in the form of computational cascades. We explain the
emergent inverse relationship between the computational complexity of the
motifs and their rank-ordering by function probabilities due to motifs, and its
relationship to symmetry in function space. We also show that the optimal
fraction of inhibition observed here supports results in computational
neuroscience, relating to optimal information processing.
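As a rough illustration of the setup described above, the sketch below wires a random threshold network with a tunable fraction of inhibitory edges, runs a deterministic cascade from each pair of input bits, and reads off the Boolean function each node computes. The parameters (n, k, p_inhib, theta) and the helper names are illustrative assumptions, not the paper's exact model.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def random_threshold_net(n, k, p_inhib):
    """Random directed net: each non-input node gets k in-edges, each
    edge inhibitory (-1) with probability p_inhib, else excitatory (+1).
    Nodes 0 and 1 serve as the two Boolean inputs."""
    w = np.zeros((n, n))
    for i in range(2, n):
        src = rng.choice(n, size=k, replace=False)
        w[i, src] = rng.choice([1, -1], size=k, p=[1 - p_inhib, p_inhib])
    return w

def cascade(w, inputs, theta=1, steps=50):
    """Avalanche-style dynamics: a node fires (and stays fired) once the
    summed weight from already-fired in-neighbours reaches theta."""
    fired = np.zeros(w.shape[0], dtype=bool)
    fired[:2] = inputs
    for _ in range(steps):
        new = (~fired) & (w @ fired >= theta)
        if not new.any():
            break
        fired |= new
    return fired

# The 4-bit response over all input pairs is the truth table of the
# Boolean function each node computes on the two input bits.
w = random_threshold_net(n=50, k=3, p_inhib=0.3)
truth = np.array([cascade(w, np.array(bits, dtype=bool))
                  for bits in itertools.product([0, 1], repeat=2)])
for node in range(2, 8):
    print(f"node {node:2d} computes truth table {truth[:, node].astype(int)}")
```

Sweeping p_inhib over [0, 1] and tallying the resulting truth tables then exposes how the repertoire and rank-ordering of computed functions shift with the fraction of inhibition.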
Related papers
- An exact mathematical description of computation with transient
spatiotemporal dynamics in a complex-valued neural network [33.7054351451505]
We study a complex-valued neural network (cv-NN) with linear time-delayed interactions.
The cv-NN displays sophisticated dynamics, including adaptable, partially synchronized "chimera" states.
We demonstrate that computations in the cv-NN are decodable by living biological neurons.
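For intuition only, here is a minimal discrete-time sketch of complex-valued units with linear time-delayed ring coupling; the coupling strength, delay, phase lag, and renormalization step are assumptions of this sketch rather than the paper's cv-NN, and whether chimera-like coexistence of coherent and incoherent regions appears depends on the parameter choices.

```python
import numpy as np

rng = np.random.default_rng(1)
N, D, T = 64, 5, 2000          # units, interaction delay, time steps
w = 0.05 * np.exp(1j * 0.4)    # complex coupling with a phase lag

# ring coupling: each unit hears its two neighbours with delay D
W = np.zeros((N, N), dtype=complex)
for i in range(N):
    W[i, (i - 1) % N] = w
    W[i, (i + 1) % N] = w

z = np.zeros((T, N), dtype=complex)
z[:D + 1] = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(D + 1, N)))

for t in range(D, T - 1):
    # linear delayed interaction plus a mild intrinsic rotation;
    # renormalising keeps only the phases interacting
    znew = np.exp(1j * 0.1) * z[t] + W @ z[t - D]
    z[t + 1] = znew / np.abs(znew)

# local phase coherence: near 1 where neighbours are synchronized
coh = np.abs((z[-1] + np.roll(z[-1], 1) + np.roll(z[-1], -1)) / 3)
print(f"local coherence ranges from {coh.min():.2f} to {coh.max():.2f}")
```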
arXiv Detail & Related papers (2023-11-28T02:23:30Z)
- Inferring Inference [7.11780383076327]
We develop a framework for inferring canonical distributed computations from large-scale neural activity patterns.
We simulate recordings for a model brain that implicitly implements an approximate inference algorithm on a probabilistic graphical model.
Overall, this framework provides a new tool for discovering interpretable structure in neural recordings.
arXiv Detail & Related papers (2023-10-04T22:12:11Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
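As a minimal sketch of the underlying phenomenon, phase-amplitude coupling, the snippet below lets the slow theta phase gate the amplitude of a fast gamma rhythm; the frequencies and modulation depth are illustrative choices, not the paper's circuit model.

```python
import numpy as np

fs = 1000.0                          # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)      # two seconds of signal
f_theta, f_gamma = 6.0, 40.0

theta_phase = 2 * np.pi * f_theta * t
gamma_amp = 0.5 * (1 + np.cos(theta_phase))   # amplitude rides theta phase
signal = np.cos(theta_phase) + gamma_amp * np.cos(2 * np.pi * f_gamma * t)

# crude coupling check: gamma envelope at theta peaks vs. theta troughs
peak = gamma_amp[np.cos(theta_phase) > 0.9].mean()
trough = gamma_amp[np.cos(theta_phase) < -0.9].mean()
print(f"gamma amplitude at theta peak {peak:.2f} vs. trough {trough:.2f}")
```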
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Spatiotemporal Patterns in Neurobiology: An Overview for Future Artificial Intelligence [0.0]
We argue that computational models are key tools for elucidating possible functionalities that emerge from network interactions.
Here we review several classes of models, including spiking and integrate-and-fire neurons.
We hope these studies will inform future developments in artificial intelligence algorithms as well as help validate our understanding of brain processes.
arXiv Detail & Related papers (2022-03-29T10:28:01Z)
- Neural Network Approximations of Compositional Functions With Applications to Dynamical Systems [3.660098145214465]
We develop an approximation theory for compositional functions and their neural network approximations.
We identify a set of key features of compositional functions and the relationship between the features and the complexity of neural networks.
In addition to function approximation, we prove several formulae giving upper bounds on the approximation error of neural networks.
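A small numerical sketch of the compositional idea, with Chebyshev fits standing in for the paper's small neural networks: approximate the pieces of f(x) = exp(sin(x)) separately and compose them, so the composition error is governed by the per-piece errors. The target function, degrees, and interval are arbitrary choices.

```python
import numpy as np

x = np.linspace(-1, 1, 400)

# fit each piece of f(x) = exp(sin(x)) separately
h = np.polynomial.Chebyshev.fit(x, np.sin(x), deg=5)   # inner piece
g = np.polynomial.Chebyshev.fit(x, np.exp(x), deg=5)   # outer piece

composed = g(h(x))            # approximate f by composing the surrogates
err = np.max(np.abs(composed - np.exp(np.sin(x))))
print(f"max composition error on [-1, 1]: {err:.2e}")
```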
arXiv Detail & Related papers (2020-12-03T04:40:25Z)
- The distribution of inhibitory neurons in the C. elegans connectome facilitates self-optimization of coordinated neural activity [78.15296214629433]
The nervous system of the nematode Caenorhabditis elegans exhibits remarkable complexity despite the worm's small size.
A general challenge is to better understand the relationship between neural organization and neural activity at the system level.
We implemented an abstract simulation model of the C. elegans connectome that approximates the neurotransmitter identity of each neuron.
arXiv Detail & Related papers (2020-10-28T23:11:37Z)
- On the Self-Repair Role of Astrocytes in STDP Enabled Unsupervised SNNs [1.0009912692042526]
This work goes beyond the focus of current neuromorphic computing architectures on computational models of neurons and synapses.
We explore the role of glial cells in the fault-tolerant capacity of Spiking Neural Networks trained in an unsupervised fashion using Spike-Timing Dependent Plasticity (STDP).
We characterize the degree of self-repair that can be enabled in such networks with varying degrees of faults ranging from 50% to 90%, and evaluate our proposal on the MNIST and Fashion-MNIST datasets.
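Below is a minimal sketch of the fault model alone, assuming a fault means a synapse stuck at zero; the random weight matrix and the inject_faults helper are hypothetical stand-ins, since in the paper the weights come from STDP training and repair is mediated by the astrocyte model.

```python
import numpy as np

rng = np.random.default_rng(2)

def inject_faults(weights, fault_fraction):
    """Disable (zero out) a random fraction of the synaptic weights."""
    mask = rng.random(weights.shape) >= fault_fraction
    return weights * mask

w = rng.normal(size=(784, 100))       # e.g. MNIST pixels -> 100 neurons
for f in (0.5, 0.7, 0.9):             # the 50%-90% range studied above
    surviving = np.count_nonzero(inject_faults(w, f)) / w.size
    print(f"fault fraction {f:.0%}: about {surviving:.0%} synapses remain")
```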
arXiv Detail & Related papers (2020-09-08T01:14:53Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
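A toy version of the min-max formulation, with both players linear rather than neural networks: the model player minimizes, and a critic maximizes, a regularized moment violation whose saddle point is the instrumental-variable solution. The data-generating process, step sizes, and quadratic critic penalty are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
z = rng.normal(size=n)                        # instrument
u = rng.normal(size=n)                        # hidden confounder
x = z + u                                     # endogenous regressor
y = 2.0 * x + u + 0.1 * rng.normal(size=n)    # true causal effect is 2.0

a, b, lr = 0.0, 0.0, 0.1                      # model player a, critic b
for _ in range(2000):
    residual = y - a * x
    # critic ascends E[(y - a x) b z] - b^2 E[z^2] / 2; model descends it;
    # at the saddle point the moment E[(y - a x) z] is driven to zero
    grad_b = np.mean(residual * z) - b * np.mean(z ** 2)
    grad_a = -b * np.mean(x * z)
    b += lr * grad_b
    a -= lr * grad_a

ols = np.dot(x, y) / np.dot(x, x)             # confounded baseline
print(f"min-max estimate {a:.3f} vs naive OLS {ols:.3f} (truth 2.0)")
```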
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Optimal Learning with Excitatory and Inhibitory synapses [91.3755431537592]
I study the problem of storing associations between analog signals in the presence of correlations.
I characterize the typical learning performance in terms of the power spectrum of random input and output processes.
arXiv Detail & Related papers (2020-05-25T18:25:54Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms that mimic the operational principles of neurons and synapses.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
- Space of Functions Computed by Deep-Layered Machines [74.13735716675987]
We study the space of functions computed by random-layered machines, including deep neural networks and Boolean circuits.
Investigating the distribution of Boolean functions computed by recurrent and layer-dependent architectures, we find that it is the same in both models.
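A minimal sketch of one side of that comparison, assuming a layered feed-forward architecture: sample random layered Boolean circuits on two inputs and tally how often each of the 16 possible truth tables appears. The gate set, depth, and width are arbitrary choices, not the paper's ensemble.

```python
import itertools
import numpy as np

rng = np.random.default_rng(4)
GATES = [np.logical_and, np.logical_or, np.logical_xor,
         lambda a, b: np.logical_not(np.logical_and(a, b))]  # NAND

def random_layered_circuit(depth, width):
    """Each gate in a layer reads two random outputs of the previous
    layer; the circuit's output is gate 0 of the final layer."""
    layers = [[(rng.integers(len(GATES)), rng.integers(width),
                rng.integers(width)) for _ in range(width)]
              for _ in range(depth)]
    def run(x0, x1):
        vals = [x0, x1] * (width // 2)     # tile the 2 inputs (even width)
        for layer in layers:
            vals = [GATES[g](vals[i], vals[j]) for g, i, j in layer]
        return vals[0]
    return run

counts = {}
inputs = list(itertools.product([False, True], repeat=2))
for _ in range(5000):
    circuit = random_layered_circuit(depth=4, width=4)
    table = tuple(bool(circuit(a, b)) for a, b in inputs)
    counts[table] = counts.get(table, 0) + 1

# rank-ordered distribution over the 16 possible 2-input functions
for table, c in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(table, c / 5000)
```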
arXiv Detail & Related papers (2020-04-19T18:31:03Z)