Latent Equilibrium: A unified learning theory for arbitrarily fast
computation with arbitrarily slow neurons
- URL: http://arxiv.org/abs/2110.14549v1
- Date: Wed, 27 Oct 2021 16:15:55 GMT
- Title: Latent Equilibrium: A unified learning theory for arbitrarily fast
computation with arbitrarily slow neurons
- Authors: Paul Haider, Benjamin Ellenberger, Laura Kriener, Jakob Jordan, Walter
Senn, Mihai A. Petrovici
- Abstract summary: We introduce Latent Equilibrium, a new framework for inference and learning in networks of slow components.
We derive disentangled neuron and synapse dynamics from a prospective energy function.
We show how our principle can be applied to detailed models of cortical microcircuitry.
- Score: 0.7340017786387767
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The response time of physical computational elements is finite, and neurons
are no exception. In hierarchical models of cortical networks each layer thus
introduces a response lag. This inherent property of physical dynamical systems
results in delayed processing of stimuli and causes a timing mismatch between
network output and instructive signals, thus afflicting not only inference, but
also learning. We introduce Latent Equilibrium, a new framework for inference
and learning in networks of slow components which avoids these issues by
harnessing the ability of biological neurons to phase-advance their output with
respect to their membrane potential. This principle enables quasi-instantaneous
inference independent of network depth and avoids the need for phased
plasticity or computationally expensive network relaxation phases. We jointly
derive disentangled neuron and synapse dynamics from a prospective energy
function that depends on a network's generalized position and momentum. The
resulting model can be interpreted as a biologically plausible approximation of
error backpropagation in deep cortical networks with continuous-time, leaky
neuronal dynamics and continuously active, local plasticity. We demonstrate
successful learning of standard benchmark datasets, achieving competitive
performance using both fully-connected and convolutional architectures, and
show how our principle can be applied to detailed models of cortical
microcircuitry. Furthermore, we study the robustness of our model to
spatio-temporal substrate imperfections to demonstrate its feasibility for
physical realization, be it in vivo or in silico.
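To make the phase-advance idea concrete, here is a minimal sketch (not the authors' code) of prospective coding for a single leaky neuron: the membrane potential u low-pass filters a time-varying input with time constant tau, and reading out the prospective potential u + tau*du/dt cancels the lag, illustrating the quasi-instantaneous inference described above. All names and parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the phase-advance ("prospective coding") idea:
# a leaky membrane tau * du/dt = x - u lags its input x, but the
# prospective readout u + tau * du/dt tracks x without delay.

tau = 10.0                           # membrane time constant (ms), illustrative
dt = 0.1                             # integration step (ms)
steps = 5000
t = np.arange(steps) * dt
x = np.sin(2 * np.pi * t / 100.0)    # time-varying input signal

u = 0.0
lagged, prospective = [], []
for k in range(steps):
    du = (x[k] - u) / tau            # leaky dynamics: tau * du/dt = x - u
    prospective.append(u + tau * du) # phase-advanced (prospective) readout
    u += dt * du                     # Euler step for the membrane potential
    lagged.append(u)                 # conventional readout: lags the input

lagged = np.array(lagged)
prospective = np.array(prospective)
print("mean |x - u|       :", np.abs(x - lagged)[1000:].mean())
print("mean |x - u_breve| :", np.abs(x - prospective)[1000:].mean())
```

For this linear toy neuron the prospective readout recovers the input exactly; in the paper the same lookahead is applied to each neuron's membrane potential before its nonlinearity, which is why inference remains quasi-instantaneous regardless of network depth.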
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Spatio-temporal Structure of Excitation and Inhibition Emerges in Spiking Neural Networks with and without Biologically Plausible Constraints [0.06752396542927405]
We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays.
We implement a dynamic pruning strategy that combines DEEP R for connection removal and RigL for connection reintroduction.
We observed that spatio-temporal patterns of excitation and inhibition appeared in the more biologically plausible model as well.
arXiv Detail & Related papers (2024-07-07T11:55:48Z)
- Backpropagation through space, time, and the brain [2.10686639478348]
We introduce Generalized Latent Equilibrium (GLE), a computational framework for fully local spatio-temporal credit assignment in physical, dynamical networks of neurons.
In particular, GLE exploits the morphology of dendritic trees to enable more complex information storage and processing in single neurons.
arXiv Detail & Related papers (2024-03-25T16:57:02Z)
- Disentangling the Causes of Plasticity Loss in Neural Networks [55.23250269007988]
We show that loss of plasticity can be decomposed into multiple independent mechanisms.
We show that a combination of layer normalization and weight decay is highly effective at maintaining plasticity in a variety of synthetic nonstationary learning tasks.
arXiv Detail & Related papers (2024-02-29T00:02:33Z)
- ELiSe: Efficient Learning of Sequences in Structured Recurrent Networks [1.5931140598271163]
We build a model for efficient learning of sequences using only local, always-on, and phase-free plasticity.
We showcase the capabilities of ELiSe in a mock-up of birdsong learning, and demonstrate its flexibility with respect to parametrization.
arXiv Detail & Related papers (2024-02-26T17:30:34Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Theory of coupled neuronal-synaptic dynamics [3.626013617212667]
In neural circuits, synaptic strengths influence neuronal activity by shaping network dynamics.
We study a recurrent-network model in which neuronal units and synaptic couplings are interacting dynamic variables.
We show that adding Hebbian plasticity slows activity in chaotic networks and can induce chaos.
arXiv Detail & Related papers (2023-02-17T16:42:59Z)
- Impact of spiking neurons leakages and network recurrences on event-based spatio-temporal pattern recognition [0.0]
Spiking neural networks coupled with neuromorphic hardware and event-based sensors are getting increased interest for low-latency and low-power inference at the edge.
We explore the impact of synaptic and membrane leakages in spiking neurons.
arXiv Detail & Related papers (2022-11-14T21:34:02Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.