Overcoming the Weight Transport Problem via Spike-Timing-Dependent
Weight Inference
- URL: http://arxiv.org/abs/2003.03988v4
- Date: Wed, 11 Aug 2021 13:25:03 GMT
- Title: Overcoming the Weight Transport Problem via Spike-Timing-Dependent
Weight Inference
- Authors: Nasir Ahmad, Luca Ambrogioni, Marcel A. J. van Gerven
- Abstract summary: We show that the use of spike timing alone outcompetes existing biologically plausible methods for synaptic weight inference in spiking neural network models.
Our proposed method is more flexible, being applicable to any spiking neuron model, is conservative in the number of parameters required for implementation, and can be deployed in an online fashion with minimal computational overhead.
- Score: 9.948484577581796
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a solution to the weight transport problem, which questions the
biological plausibility of the backpropagation algorithm. We derive our method
based upon a theoretical analysis of the (approximate) dynamics of leaky
integrate-and-fire neurons. We show that the use of spike timing alone
outcompetes existing biologically plausible methods for synaptic weight
inference in spiking neural network models. Furthermore, our proposed method is
more flexible, being applicable to any spiking neuron model, is conservative in
how many parameters are required for implementation and can be deployed in an
online fashion with minimal computational overhead. These features, together
with its biological plausibility, make it an attractive mechanism underlying
weight inference at single synapses.
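As a loose illustration of the setting described above (not the paper's algorithm), the sketch below simulates a leaky integrate-and-fire neuron with instantaneous current-based synapses and recovers the hidden weights by regressing per-step membrane jumps on presynaptic spike indicators. All parameter values and the regression step are simplifying assumptions of ours:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one LIF neuron with instantaneous current-based synapses.
dt, tau = 1e-3, 20e-3                      # time step, membrane time constant
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0  # normalized membrane units
T, n_pre = 20000, 5
w_true = rng.uniform(0.05, 0.25, n_pre)    # hidden synaptic weights to infer
pre = rng.random((T, n_pre)) < 0.02        # Bernoulli presynaptic spike trains

v = v_rest
v_trace = np.empty(T)
fired = np.zeros(T, dtype=bool)
for t in range(T):
    v += dt * (v_rest - v) / tau           # passive leak toward rest
    v += pre[t] @ w_true                   # each presynaptic spike adds its weight
    if v >= v_thresh:                      # postsynaptic spike and reset
        v = v_reset
        fired[t] = True
    v_trace[t] = v

# Infer weights from the membrane jumps that follow presynaptic spikes,
# skipping reset steps; a constant column absorbs the mean leak.
dv = np.diff(v_trace, prepend=v_rest)
X = np.column_stack([pre.astype(float), np.ones(T)])
keep = ~fired
w_est = np.linalg.lstsq(X[keep], dv[keep], rcond=None)[0][:n_pre]
```

The paper's actual method works from spike times alone rather than from direct access to the membrane trace; this toy only conveys why presynaptic spike timing carries enough information about the weights in principle.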
Related papers
- Neuronal Group Communication for Efficient Neural representation [85.36421257648294]
This paper addresses the question of how to build large neural systems that learn efficient, modular, and interpretable representations.
We propose Neuronal Group Communication (NGC), a theory-driven framework that reimagines a neural network as a dynamical system of interacting neuronal groups.
NGC treats weights as transient interactions between embedding-like neuronal states, with neural computation unfolding through iterative communication among groups of neurons.
arXiv Detail & Related papers (2025-10-19T14:23:35Z)
- Deep Learning without Weight Symmetry [2.3462002656701966]
Backpropagation, a foundational algorithm for training artificial neural networks, predominates in contemporary deep learning.
Backpropagation relies on precise symmetry between feedforward and feedback weights to accurately propagate gradient signals that assign credit.
We introduce the Product Feedback Alignment (PFA) algorithm to solve the longstanding problem of credit assignment in the brain.
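PFA's specifics aside, the underlying idea it builds on, feedback alignment, replaces the transposed forward weights in the backward pass with fixed random feedback weights. A toy sketch of that classic idea (our own illustration, not the PFA algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: recover a linear map from inputs to targets.
X = rng.standard_normal((200, 10))
Y = X @ rng.standard_normal((10, 3))

# Two-layer ReLU network; B replaces W2.T in the backward pass.
W1 = rng.standard_normal((10, 20)) * 0.1
W2 = rng.standard_normal((20, 3)) * 0.1
B = rng.standard_normal((3, 20)) * 0.1     # fixed random feedback weights

lr = 0.1
losses = []
for _ in range(1000):
    h = np.maximum(X @ W1, 0.0)            # hidden layer
    err = h @ W2 - Y                       # output error (dL/dout for MSE)
    losses.append(float((err ** 2).mean()))
    dW2 = h.T @ err / len(X)
    e_h = (err @ B) * (h > 0)              # random feedback instead of W2.T
    dW1 = X.T @ e_h / len(X)
    W1 -= lr * dW1
    W2 -= lr * dW2
```

Despite the feedback weights never matching the forward ones, the forward weights tend to align with the feedback over training, which is why the loss still falls.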
arXiv Detail & Related papers (2024-05-31T03:11:19Z)
- Correlative Information Maximization: A Biologically Plausible Approach to Supervised Deep Neural Networks without Weight Symmetry [43.584567991256925]
We propose a new normative approach to describe the signal propagation in biological neural networks in both forward and backward directions.
This framework addresses many concerns about the biological plausibility of conventional artificial neural networks and the backpropagation algorithm.
Our approach provides a natural resolution to the weight symmetry problem between forward and backward signal propagation paths.
arXiv Detail & Related papers (2023-06-07T22:14:33Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
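The core mechanism here, fitting a differentiable forward model to data by gradient descent, can be shown in minimal form. The toy below uses an exponential-decay model with a hand-derived gradient standing in for automatic differentiation; the model and all values are our own assumptions, not the paper's Hamiltonian setting:

```python
import numpy as np

# Synthetic "experimental" data from a known decay rate.
t = np.linspace(0.0, 4.0, 200)
lam_true = 0.8
data = np.exp(-lam_true * t)

# Recover the unknown parameter by gradient descent on the squared error.
lam = 0.3                                  # initial guess
lr = 0.5
for _ in range(500):
    pred = np.exp(-lam * t)                # differentiable forward model
    resid = pred - data
    grad = np.mean(resid * (-t) * pred)    # d(0.5 * MSE) / d(lam)
    lam -= lr * grad
```

With an autodiff framework, `grad` would be produced automatically from the forward model, which is what makes the once-built differentiable model reusable on new data.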
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Generalization of generative model for neuronal ensemble inference method [0.0]
In this study, we extend the range of the variable for expressing the neuronal state, and generalize the likelihood of the model for extended variables.
This generalization without restriction of the binary input enables us to perform soft clustering and apply the method to non-stationary neuroactivity data.
arXiv Detail & Related papers (2022-11-07T07:58:29Z)
- Understanding Weight Similarity of Neural Networks via Chain Normalization Rule and Hypothesis-Training-Testing [58.401504709365284]
We present a weight similarity measure that can quantify the weight similarity of non-convolutional neural networks.
We first normalize the weights of neural networks by a chain normalization rule, which is used to introduce weight-training representation learning.
We extend the traditional hypothesis-testing method to validate the hypothesis on the weight similarity of neural networks.
arXiv Detail & Related papers (2022-08-08T19:11:03Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Formalizing Generalization and Robustness of Neural Networks to Weight Perturbations [58.731070632586594]
We provide the first formal analysis for feed-forward neural networks with non-negative monotone activation functions against weight perturbations.
We also design a new theory-driven loss function for training generalizable and robust neural networks against weight perturbations.
arXiv Detail & Related papers (2021-03-03T06:17:03Z)
- Non-Singular Adversarial Robustness of Neural Networks [58.731070632586594]
Adversarial robustness has become an emerging challenge for neural networks owing to their over-sensitivity to small input perturbations.
We formalize the notion of non-singular adversarial robustness for neural networks through the lens of joint perturbations to data inputs as well as model weights.
arXiv Detail & Related papers (2021-02-23T20:59:30Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- EqSpike: Spike-driven Equilibrium Propagation for Neuromorphic Implementations [9.952561670370804]
We develop a spiking neural network algorithm called EqSpike, compatible with neuromorphic systems.
We show that EqSpike implemented in silicon neuromorphic technology could reduce the energy consumption of both inference and training.
arXiv Detail & Related papers (2020-10-15T16:25:29Z)
- Online neural connectivity estimation with ensemble stimulation [5.156484100374058]
We propose a method based on noisy group testing that drastically increases the efficiency of this process in sparse networks.
We show that it is possible to recover binarized network connectivity with a number of tests that grows only logarithmically with population size.
We also demonstrate the feasibility of inferring connectivity for networks of up to tens of thousands of neurons online.
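The logarithmic-test claim is the classic group-testing regime. A toy noise-free sketch (using the standard COMP decoder, not the paper's noisy-group-testing procedure) recovers a k-sparse binary connectivity vector from pooled OR-tests whose count scales like k log n:

```python
import numpy as np

rng = np.random.default_rng(1)

n, k, m = 1000, 5, 250      # neurons, true connections, pooled tests (~k log n)
x = np.zeros(n, dtype=bool)
x[rng.choice(n, size=k, replace=False)] = True  # sparse binary connectivity

# Each test stimulates a random pool; the readout is an OR over pooled entries.
pools = rng.random((m, n)) < 1.0 / k            # include each neuron w.p. 1/k
counts = pools.astype(int) @ x.astype(int)
positive = counts > 0

# COMP decoding: any neuron appearing in a negative test cannot be connected.
candidate = np.ones(n, dtype=bool)
for pool, pos in zip(pools, positive):
    if not pos:
        candidate &= ~pool
```

In the noisy setting the paper targets, individual test outcomes can flip, so the decoder must tolerate errors rather than trust every negative test outright.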
arXiv Detail & Related papers (2020-07-27T23:47:03Z)
- Latent Network Structure Learning from High Dimensional Multivariate Point Processes [5.079425170410857]
We propose a new class of nonstationary Hawkes processes to characterize the complex processes underlying the observed data.
We estimate the latent network structure using an efficient sparse least squares estimation approach.
We demonstrate the efficacy of our proposed method through simulation studies and an application to a neuron spike train data set.
arXiv Detail & Related papers (2020-04-07T17:48:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.