ResNets, NeuralODEs and CT-RNNs are Particular Neural Regulatory
Networks
- URL: http://arxiv.org/abs/2002.12776v3
- Date: Thu, 19 Mar 2020 07:20:12 GMT
- Title: ResNets, NeuralODEs and CT-RNNs are Particular Neural Regulatory
Networks
- Authors: Radu Grosu
- Abstract summary: This paper shows that ResNets, NeuralODEs, and CT-RNNs are particular neural regulatory networks (NRNs).
NRNs are a biophysical model for the nonspiking neurons encountered in small species, such as the C. elegans nematode, and in the retina of large species.
For a given approximation task, this considerable succinctness allows one to learn a very small and therefore understandable NRN.
- Score: 10.518340300810504
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper shows that ResNets, NeuralODEs, and CT-RNNs are particular neural
regulatory networks (NRNs), a biophysical model for the nonspiking neurons
encountered in small species, such as the C. elegans nematode, and in the retina
of large species. Compared to ResNets, NeuralODEs and CT-RNNs, NRNs have an
additional multiplicative term in their synaptic computation, allowing them to
adapt to each particular input. This additional flexibility makes NRNs $M$
times more succinct than NeuralODEs and CT-RNNs, where $M$ is proportional to
the size of the training set. Moreover, as NeuralODEs and CT-RNNs are $N$ times
more succinct than ResNets, where $N$ is the number of integration steps
required to compute the output $F(x)$ for a given input $x$, NRNs are in total
$M\,{\cdot}\,N$ times more succinct than ResNets. For a given approximation task,
this considerable succinctness allows one to learn a very small and therefore
understandable NRN, whose behavior can be explained in terms of well-established
architectural motifs that NRNs share with gene regulatory networks, such as
activation, inhibition, sequentialization, mutual exclusion, and synchronization.
To the best of our knowledge, this paper unifies for the first time the
mainstream work on deep neural networks with that in biology and neuroscience in
a quantitative fashion.
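To make the comparison concrete, the sketch below (a minimal Python/NumPy illustration, assuming sigmoid synaptic activations, explicit-Euler integration, and illustrative parameter names not taken from the paper) contrasts a CT-RNN/NeuralODE-style state update with an NRN update, whose synapses carry the additional multiplicative term $\sigma(v_j)\,(E_{ij} - v_i)$:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ct_rnn_step(x, W, b, tau, dt):
    # One explicit-Euler step of CT-RNN / NeuralODE-style dynamics:
    #   dx/dt = -x / tau + W @ sigmoid(x) + b   (weights shared across steps)
    return x + dt * (-x / tau + W @ sigmoid(x) + b)

def nrn_step(v, W, E, g_l, v_l, C, dt):
    # One explicit-Euler step of NRN-style dynamics. Each synapse carries a
    # multiplicative term sigmoid(v_pre) * (E - v_post), so the synaptic
    # current also depends on the postsynaptic state:
    #   C dv_i/dt = -g_l (v_i - v_l) + sum_j W[i,j] sigmoid(v_j) (E[i,j] - v_i)
    syn = (W * sigmoid(v)[None, :] * (E - v[:, None])).sum(axis=1)
    return v + dt * (-g_l * (v - v_l) + syn) / C

# A ResNet block is one such explicit step with its own per-step weights;
# N shared-weight integration steps thus stand in for N distinct blocks
# (the N-fold succinctness), and the multiplicative term adds the M-fold gain.
rng = np.random.default_rng(0)
n = 4
v = rng.normal(size=n)
W, E = rng.normal(size=(n, n)), rng.normal(size=(n, n))
print(nrn_step(v, W, E, g_l=0.5, v_l=0.0, C=1.0, dt=0.1))
```

Because each synaptic current scales with $(E_{ij} - v_i)$, the effective time constant of every neuron depends on its current state and input, which is the adaptivity the abstract credits for the $M$-fold succinctness over CT-RNNs and NeuralODEs.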
Related papers
- Accurate Mapping of RNNs on Neuromorphic Hardware with Adaptive Spiking Neurons [2.9410174624086025]
We present a $\Sigma\Delta$-low-pass RNN (lpRNN) for mapping rate-based RNNs to spiking neural networks (SNNs).
An adaptive spiking neuron model encodes signals using $\Sigma\Delta$-modulation and enables precise mapping.
We demonstrate the implementation of the lpRNN on Intel's neuromorphic research chip Loihi; the $\Sigma\Delta$ encoding principle is sketched after this entry.
arXiv Detail & Related papers (2024-07-18T14:06:07Z)
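The lpRNN's adaptive spiking neuron is more elaborate than this, but a toy first-order Sigma-Delta modulator (an assumption for illustration, not the paper's model) shows the $\Sigma\Delta$ principle: spikes are emitted so that their running sum tracks the integrated input.

```python
import numpy as np

def sigma_delta_encode(signal, threshold=1.0):
    # First-order Sigma-Delta modulation: accumulate the input and emit a
    # spike (subtracting one threshold quantum as feedback) whenever the
    # accumulator crosses the threshold.
    acc, spikes = 0.0, []
    for s in signal:
        acc += s
        if acc >= threshold:
            spikes.append(1)
            acc -= threshold
        else:
            spikes.append(0)
    return np.array(spikes)

t = np.linspace(0.0, 1.0, 400)
x = 0.05 * (1.0 + np.sin(2.0 * np.pi * 3.0 * t))  # slow rate signal in [0, 0.1]
s = sigma_delta_encode(x)
print(s.sum(), "spikes; local spike density tracks the signal amplitude")
```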
- Novel Kernel Models and Exact Representor Theory for Neural Networks Beyond the Over-Parameterized Regime [52.00917519626559]
This paper presents two models of neural networks and their training, applicable to networks of arbitrary width, depth, and topology.
We also present an exact, novel representor theory for layer-wise neural network training with unregularized gradient descent, in terms of a local-extrinsic neural kernel (LeNK).
This representor theory gives insight into the role of higher-order statistics in neural network training and the effect of kernel evolution in neural-network kernel models.
arXiv Detail & Related papers (2024-05-24T06:30:36Z)
- Random-coupled Neural Network [17.53731608985241]
The pulse-coupled neural network (PCNN) is a widely applied model for imitating the characteristics of the human brain in the computer vision and neural network fields.
In this study, a random-coupled neural network (RCNN) is proposed; it overcomes difficulties in PCNN's neuromorphic computing via a random inactivation process, illustrated after this entry.
arXiv Detail & Related papers (2024-03-26T09:13:06Z)
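The RCNN's random inactivation is specified at the level of PCNN coupling; the sketch below is purely an assumed, dropout-style illustration of randomly inactivating coupling weights at each step:

```python
import numpy as np

def random_inactivation(W, p_inactive, rng):
    # Randomly zero out a fraction p_inactive of the coupling weights at each
    # step, so no fixed, hardware-unfriendly coupling pattern is required.
    mask = rng.random(W.shape) >= p_inactive
    return W * mask

rng = np.random.default_rng(0)
W = np.ones((4, 4))
print(random_inactivation(W, p_inactive=0.5, rng=rng))
```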
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
The study of the neural tangent kernel (NTK) has focused on typical neural network architectures but remains incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK; the generic form of that predictor is sketched after this entry.
arXiv Detail & Related papers (2022-09-16T06:36:06Z)
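The paper's technical contribution is the specific finite-width NTK for such networks; the sketch below shows only the generic kernel regression predictor that any such kernel plugs into (the ridge term, the RBF stand-in kernel, and all names are illustrative assumptions):

```python
import numpy as np

def kernel_regression_predict(K_train, K_cross, y_train, ridge=1e-6):
    # Kernel regression predictor for a kernel K (here, an NTK):
    #   f(X_test) = K(X_test, X_train) @ (K(X_train, X_train) + ridge*I)^{-1} @ y
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + ridge * np.eye(n), y_train)
    return K_cross @ alpha

# Toy usage with an RBF kernel standing in for the NTK (an assumption):
def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = np.linspace(-1.0, 1.0, 20)[:, None]
y = np.sin(3.0 * X[:, 0])
Xt = np.array([[0.25]])
print(kernel_regression_predict(rbf(X, X), rbf(Xt, X), y))
```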
- Linear Leaky-Integrate-and-Fire Neuron Model Based Spiking Neural Networks and Its Mapping Relationship to Deep Neural Networks [7.840247953745616]
Spiking neural networks (SNNs) are brain-inspired machine learning algorithms with merits such as biological plausibility and unsupervised learning capability.
This paper establishes a precise mathematical mapping between the biological parameters of the Linear Leaky-Integrate-and-Fire (LIF) model/SNNs and the parameters of ReLU-AN/Deep Neural Networks (DNNs); a simplified rate-to-ReLU illustration follows this entry.
arXiv Detail & Related papers (2022-05-31T17:02:26Z)
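As a simplified intuition for such a mapping (reduced here to a non-leaky integrate-and-fire neuron; the paper's exact mapping involves the leaky model's biological parameters), the firing rate under a constant input current behaves like a scaled ReLU of that current:

```python
def if_rate(i_const, v_th=1.0, t_sim=1000, dt=1.0):
    # Count spikes of a non-leaky integrate-and-fire neuron driven by a
    # constant input current; the reset subtracts the threshold.
    v, spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        v += i_const * dt
        if v >= v_th:
            spikes += 1
            v -= v_th
    return spikes / t_sim

for i in [-0.2, 0.0, 0.1, 0.3]:
    # The rate grows linearly with positive input and is zero otherwise,
    # i.e. rate(i) ~ ReLU(i) / v_th.
    print(f"I={i:+.1f}  rate={if_rate(i):.3f}  relu={max(0.0, i):.3f}")
```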
- Combining Spiking Neural Network and Artificial Neural Network for Enhanced Image Classification [1.8411688477000185]
Spiking neural networks (SNNs), which more closely resemble biological brain synapses, have attracted attention owing to their low power consumption.
We build versatile hybrid neural networks (HNNs) that improve performance on the tasks concerned.
arXiv Detail & Related papers (2021-02-21T12:03:16Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
In this study, the focus is on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm, the essential learning theory of biological neurons.
Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve learning performance superior to that of GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Self-Organized Operational Neural Networks with Generative Neurons [87.32169414230822]
ONNs are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators.
We propose Self-organized ONNs (Self-ONNs) with generative neurons that can adapt (optimize) the nodal operator of each connection; a toy sketch of such an adaptive nodal operator follows this entry.
arXiv Detail & Related papers (2020-04-24T14:37:56Z)
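Self-ONNs are described as adapting the nodal operator of each connection; one common formulation (assumed here for illustration) writes that operator as a truncated Maclaurin series with learnable coefficients:

```python
import numpy as np

def generative_nodal(x, q_weights):
    # Learnable nodal operator as a truncated Maclaurin series:
    #   psi(x) = sum_k w_k * x^(k+1)
    # With q_weights = [w, 0, 0, ...] this degenerates to the plain
    # multiplication of a conventional CNN connection.
    return sum(w * x ** (k + 1) for k, w in enumerate(q_weights))

x = np.linspace(-1.0, 1.0, 5)
print(generative_nodal(x, [1.0, 0.0, 0.0]))   # linear, CNN-like connection
print(generative_nodal(x, [0.5, -0.3, 0.2]))  # an adapted cubic nodal operator
```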