StochEP: Stochastic Equilibrium Propagation for Spiking Convergent Recurrent Neural Networks
- URL: http://arxiv.org/abs/2511.11320v1
- Date: Fri, 14 Nov 2025 13:58:39 GMT
- Title: StochEP: Stochastic Equilibrium Propagation for Spiking Convergent Recurrent Neural Networks
- Authors: Jiaqi Lin, Yi Jiang, Abhronil Sengupta
- Abstract summary: Spiking Neural Networks (SNNs) promise energy-efficient, sparse, biologically inspired computation. Training them with Backpropagation Through Time (BPTT) and surrogate gradients achieves strong performance but remains biologically implausible. We propose a stochastic Equilibrium Propagation (EP) framework that integrates probabilistic spiking neurons into the EP paradigm.
- Score: 21.560035501130006
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking Neural Networks (SNNs) promise energy-efficient, sparse, biologically inspired computation. Training them with Backpropagation Through Time (BPTT) and surrogate gradients achieves strong performance but remains biologically implausible. Equilibrium Propagation (EP) provides a more local and biologically grounded alternative. However, existing EP frameworks, primarily based on deterministic neurons, either require complex mechanisms to handle discontinuities in spiking dynamics or fail to scale beyond simple visual tasks. Inspired by the stochastic nature of biological spiking mechanisms and recent hardware trends, we propose a stochastic EP framework that integrates probabilistic spiking neurons into the EP paradigm. This formulation smooths the optimization landscape, stabilizes training, and enables scalable learning in deep convolutional spiking convergent recurrent neural networks (CRNNs). We show theoretically that the proposed stochastic EP dynamics approximate deterministic EP under mean-field theory, thereby inheriting its underlying guarantees. The proposed framework narrows the gap to both BPTT-trained SNNs and EP-trained non-spiking CRNNs on vision benchmarks while preserving locality, highlighting stochastic EP as a promising direction for neuromorphic and on-chip learning.
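Concretely, the free/nudged contrast at the heart of EP carries over to probabilistic neurons: run the network with Bernoulli spiking to a stochastic steady state, time-average the spikes into firing rates, and update weights from the rate difference between the two phases. The sketch below illustrates this on a toy fully connected network; the dynamics, layer sizes, nudging strength, and averaging window are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def relax(x, W1, W2, beta=0.0, target=None, steps=60, avg_last=20, dt=0.5):
    """Run stochastic dynamics to a (noisy) steady state; return time-averaged rates."""
    nh, ny = W1.shape[0], W2.shape[0]
    uh, uy = np.zeros(nh), np.zeros(ny)
    sh, sy = np.zeros(nh), np.zeros(ny)
    rh, ry = np.zeros(nh), np.zeros(ny)
    for t in range(steps):
        # Membrane potentials follow leaky dynamics driven by presynaptic spikes.
        uh += dt * (W1 @ x + W2.T @ sy - uh)
        drive = W2 @ sh
        if beta:  # nudged phase: weakly pull the outputs toward the target
            drive = drive + beta * (target - sy)
        uy += dt * (drive - uy)
        # Probabilistic spiking: fire with probability sigmoid(u).
        sh = (rng.random(nh) < sigmoid(uh)).astype(float)
        sy = (rng.random(ny) < sigmoid(uy)).astype(float)
        if t >= steps - avg_last:  # average spikes once dynamics have settled
            rh += sh / avg_last
            ry += sy / avg_last
    return rh, ry

# Toy dimensions; EP assumes the same weights carry forward and feedback signals.
nx, nh, ny, beta, lr = 8, 16, 4, 0.2, 0.05
W1 = rng.normal(0, 0.3, (nh, nx))
W2 = rng.normal(0, 0.3, (ny, nh))
x = rng.random(nx)
target = np.eye(ny)[1]

rh0, ry0 = relax(x, W1, W2)                            # free phase
rhb, ryb = relax(x, W1, W2, beta=beta, target=target)  # nudged phase
# Contrastive, purely local EP update from the two sets of firing rates.
W2 += lr / beta * (np.outer(ryb, rhb) - np.outer(ry0, rh0))
W1 += lr / beta * (np.outer(rhb, x) - np.outer(rh0, x))
```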
Related papers
- General Self-Prediction Enhancement for Spiking Neurons [71.01912385372577]
Spiking Neural Networks (SNNs) are highly energy-efficient due to event-driven, sparse computation, but their training is challenged by spike non-differentiability and trade-offs among performance, efficiency, and biological plausibility. We propose a self-prediction enhanced spiking neuron method that generates an internal prediction current from its input-output history to modulate the membrane potential. This design offers dual advantages: it creates a continuous gradient path that alleviates vanishing gradients and boosts training stability and accuracy, while also aligning with biological principles, resembling distal dendritic modulation and error-driven synaptic plasticity.
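A minimal sketch of one way such a neuron could look, assuming a leaky integrate-and-fire body, a low-pass prediction current built from recent input and output, and an additive modulation gain alpha; none of these specifics are confirmed by the abstract.

```python
import numpy as np

def lif_with_self_prediction(inputs, tau=0.9, tau_p=0.95, alpha=0.5, v_th=1.0):
    """LIF neuron whose membrane is modulated by a prediction current built
    from its own input/output history (a guess at the paper's mechanism)."""
    v, p, spikes = 0.0, 0.0, []
    for i in inputs:
        last = spikes[-1] if spikes else 0.0
        # Prediction current: leaky trace of recent drive minus recent spiking.
        p = tau_p * p + (1 - tau_p) * (i - last)
        v = tau * v + i + alpha * p        # prediction modulates the membrane
        s = float(v >= v_th)
        v = v * (1.0 - s)                  # hard reset on spike
        spikes.append(s)
    return spikes

print(lif_with_self_prediction(np.full(20, 0.4)))
```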
arXiv Detail & Related papers (2026-01-29T15:08:48Z)
- Neuronal Group Communication for Efficient Neural representation [85.36421257648294]
This paper addresses the question of how to build large neural systems that learn efficient, modular, and interpretable representations. We propose Neuronal Group Communication (NGC), a theory-driven framework that reimagines a neural network as a dynamical system of interacting neuronal groups. NGC treats weights as transient interactions between embedding-like neuronal states, with neural computation unfolding through iterative communication among groups of neurons.
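Read literally, "weights as transient interactions between embedding-like neuronal states" suggests interaction strengths computed on the fly from the group states themselves, in the spirit of attention. The sketch below is one speculative rendering of that idea; the similarity measure, softmax mixing, and tanh update are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ngc_step(states):
    """One round of communication among neuronal groups: interaction
    strengths are derived from the current group states themselves
    (weights as transient interactions, not stored parameters)."""
    sims = states @ states.T                 # pairwise state similarity
    np.fill_diagonal(sims, -np.inf)          # no self-interaction
    attn = np.exp(sims) / np.exp(sims).sum(axis=1, keepdims=True)
    messages = attn @ states                 # each group aggregates the others
    return np.tanh(states + 0.5 * messages)  # leaky nonlinear update

states = rng.normal(size=(6, 8))             # 6 groups, 8-dim embedding each
for _ in range(10):                          # computation unfolds iteratively
    states = ngc_step(states)
```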
arXiv Detail & Related papers (2025-10-19T14:23:35Z)
- Training Deep Normalization-Free Spiking Neural Networks with Lateral Inhibition [52.59263087086756]
Training deep spiking neural networks (SNNs) has critically depended on explicit normalization schemes, such as batch normalization. We propose a normalization-free learning framework that incorporates lateral inhibition inspired by cortical circuits. We show that our framework enables stable training of deep SNNs with biological realism and achieves competitive performance without resorting to explicit normalization.
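As a rough illustration, lateral inhibition can stand in for normalization by letting each neuron's drive be suppressed by its layer's aggregate activity. The two variants below, subtractive and divisive, are generic textbook forms, not necessarily the paper's circuit.

```python
import numpy as np

def lateral_inhibition(drive, w_inh=0.3):
    """Subtractive lateral inhibition: each neuron is suppressed by the
    mean activity of its layer, re-centering activations without any
    batch statistics (one plausible reading of the mechanism)."""
    return drive - w_inh * drive.mean(axis=-1, keepdims=True)

def divisive_inhibition(drive, eps=1e-6):
    """Divisive variant: activity is rescaled by the layer's total drive,
    which bounds activations the way normalization layers do."""
    return drive / (eps + np.abs(drive).sum(axis=-1, keepdims=True))

x = np.random.default_rng(0).normal(2.0, 1.0, (4, 32))  # biased pre-activations
print(lateral_inhibition(x).mean())   # layer mean reduced by the inhibition
```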
arXiv Detail & Related papers (2025-09-27T11:11:30Z)
- Toward Practical Equilibrium Propagation: Brain-inspired Recurrent Neural Network with Feedback Regulation and Residual Connections [7.464380138405363]
We propose a biologically plausible Feedback-regulated REsidual recurrent neural network (FRE-RNN) and study its learning performance. The improvement in convergence properties reduces the computational cost and training time of EP by orders of magnitude. Our approach substantially enhances the applicability and practicality of EP in the large-scale networks that underpin artificial intelligence.
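A toy rendering of the two ingredients named in the title, feedback regulation (here a gain g on top-down input) and residual connections (here a projection that skips a layer), inside a convergent RNN; the scalar gain, the skip placement, and the tanh dynamics are assumptions, not the authors' equations.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [10, 32, 32, 4]  # input, two hidden layers, output
Wf = [rng.normal(0, 0.3, (sizes[i + 1], sizes[i])) for i in range(3)]
R = rng.normal(0, 0.3, (sizes[3], sizes[1]))  # residual projection (assumed form)
g = 0.5                                        # feedback regulation gain (assumed scalar)

def relax(x, steps=50, dt=0.5):
    """Convergent RNN with regulated top-down feedback plus a residual
    path from the first hidden layer straight to the output."""
    h = [np.zeros(n) for n in sizes[1:]]
    for _ in range(steps):
        a1 = Wf[0] @ x + g * (Wf[1].T @ h[1])   # feedback-regulated input
        a2 = Wf[1] @ h[0] + g * (Wf[2].T @ h[2])
        a3 = Wf[2] @ h[1] + R @ h[0]            # residual skip connection
        for i, a in enumerate((a1, a2, a3)):
            h[i] += dt * (np.tanh(a) - h[i])
    return h

h = relax(rng.random(10))
```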
arXiv Detail & Related papers (2025-08-05T15:07:50Z)
- Exploiting Heterogeneity in Timescales for Sparse Recurrent Spiking Neural Networks for Energy-Efficient Edge Computing [16.60622265961373]
Spiking Neural Networks (SNNs) represent the forefront of neuromorphic computing.
This paper combines three studies on exploiting heterogeneity in neuronal timescales to improve the performance of sparse recurrent SNNs for energy-efficient edge computing.
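The core ingredient is easy to sketch: give every neuron its own membrane time constant instead of a shared one, so the recurrent pool spans many timescales. The ranges, sparsity level, and LIF details below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Heterogeneity: each neuron draws its own leak timescale rather than
# sharing one global constant.
tau = rng.uniform(5.0, 50.0, n)        # per-neuron time constants, in steps
decay = np.exp(-1.0 / tau)
# Sparse recurrent connectivity (10% of synapses present).
W = rng.normal(0, 0.1, (n, n)) * (rng.random((n, n)) < 0.1)

v = np.zeros(n)
spikes = np.zeros(n)
for t in range(200):
    inp = rng.random(n) < 0.05          # sparse Poisson-like input
    v = decay * v + W @ spikes + inp
    spikes = (v > 1.0).astype(float)
    v = v * (1 - spikes)                # reset after spiking
```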
arXiv Detail & Related papers (2024-07-08T23:33:12Z)
- Evolutionary algorithms as an alternative to backpropagation for supervised training of Biophysical Neural Networks and Neural ODEs [12.357635939839696]
We investigate the use of "gradient-estimating" evolutionary algorithms for training biophysically based neural networks.
We find that EAs have several advantages that make them preferable to direct BP.
Our findings suggest that biophysical neurons could provide useful benchmarks for testing the limits of BP methods.
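A standard "gradient-estimating" evolutionary scheme in this spirit is antithetic evolution strategies, which scores random parameter perturbations with forward simulations only, so no backpropagation through a stiff or non-differentiable biophysical model is needed. The sketch below uses a toy quadratic loss as a stand-in for a biophysical simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def es_gradient(loss_fn, theta, sigma=0.1, pop=64):
    """Antithetic evolution-strategies gradient estimate: perturb the
    parameters, score each perturbation with forward evaluations only,
    and average the scaled perturbations."""
    eps = rng.normal(size=(pop, theta.size))
    losses_p = np.array([loss_fn(theta + sigma * e) for e in eps])
    losses_m = np.array([loss_fn(theta - sigma * e) for e in eps])
    return ((losses_p - losses_m)[:, None] * eps).mean(axis=0) / (2 * sigma)

# Toy objective standing in for a biophysical simulation's loss.
target = np.linspace(-1, 1, 5)
loss = lambda th: float(((th - target) ** 2).sum())

theta = np.zeros(5)
for _ in range(200):
    theta -= 0.1 * es_gradient(loss, theta)
print(theta.round(2))  # approaches target without any gradient calls
```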
arXiv Detail & Related papers (2023-11-17T20:59:57Z)
- Heterogeneous Neuronal and Synaptic Dynamics for Spike-Efficient Unsupervised Learning: Theory and Design Principles [13.521272923545409]
We analytically show that the diversity in neurons' integration/relaxation dynamics improves an RSNN's ability to learn more distinct input patterns (higher memory capacity).
We further prove that heterogeneous Spike-Timing-Dependent-Plasticity (STDP) dynamics of synapses reduce spiking activity but preserve memory capacity.
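A pair-based STDP rule with heterogeneity is straightforward to write down: draw per-synapse amplitudes and time constants instead of sharing constants across the network. The parameter ranges and the slight depression bias below (one way spiking activity could be reduced) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stdp_update(w, dt_spike, a_plus, a_minus, tau_plus, tau_minus):
    """Pair-based STDP with per-synapse (heterogeneous) parameters;
    dt_spike = t_post - t_pre for each synapse."""
    potentiate = dt_spike > 0
    dw = np.where(potentiate,
                  a_plus * np.exp(-dt_spike / tau_plus),   # pre-before-post
                  -a_minus * np.exp(dt_spike / tau_minus))  # post-before-pre
    return w + dw

n = 50
w = rng.random(n) * 0.5
# Heterogeneity: every synapse draws its own STDP parameters.
a_plus = rng.uniform(0.001, 0.010, n)
a_minus = rng.uniform(0.001, 0.012, n)   # slight bias toward depression
tau_plus = rng.uniform(10.0, 40.0, n)
tau_minus = rng.uniform(10.0, 40.0, n)

dt_spike = rng.uniform(-50, 50, n)        # post-minus-pre timing (ms)
w = np.clip(stdp_update(w, dt_spike, a_plus, a_minus, tau_plus, tau_minus), 0, 1)
```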
arXiv Detail & Related papers (2023-02-22T19:48:02Z)
- A Theoretical Framework for Inference and Learning in Predictive Coding Networks [41.58529335439799]
Predictive coding (PC) is an influential theory in computational neuroscience.
We provide a comprehensive theoretical analysis of the properties of PCNs trained with prospective configuration.
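For orientation, a standard PCN performs inference by relaxing latent activities to minimize a sum of squared prediction errors, then updates weights locally from the equilibrium errors. The three-layer supervised sketch below follows this textbook formulation, not the paper's prospective-configuration variant.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh
df = lambda x: 1 - np.tanh(x) ** 2

# Three-layer predictive coding net: each layer predicts the one below it.
n0, n1, n2 = 10, 16, 4
W0 = rng.normal(0, 0.3, (n0, n1))   # layer 1 -> prediction of layer 0
W1 = rng.normal(0, 0.3, (n1, n2))   # layer 2 -> prediction of layer 1

x0 = rng.random(n0)                 # clamped to the data
x2 = np.eye(n2)[1]                  # clamped to the label (supervised PC)
x1 = np.zeros(n1)                   # latent layer, inferred at run time

eta, lr = 0.1, 0.01
for _ in range(100):                # inference: relax x1 to minimize energy
    e0 = x0 - W0 @ f(x1)            # prediction errors, one per layer
    e1 = x1 - W1 @ f(x2)
    x1 -= eta * (e1 - df(x1) * (W0.T @ e0))  # dF/dx1, purely local signals

# Hebbian-looking weight updates from the equilibrium errors.
W0 += lr * np.outer(e0, f(x1))
W1 += lr * np.outer(e1, f(x2))
```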
arXiv Detail & Related papers (2022-07-21T04:17:55Z)
- Scaling Equilibrium Propagation to Deep ConvNets by Drastically Reducing its Gradient Estimator Bias [62.43908463620527]
In practice, EP does not scale to visual tasks harder than MNIST.
We show that a bias in the gradient estimate of EP, inherent in the use of finite nudging, is responsible for this phenomenon.
These results highlight EP as a scalable approach to compute error gradients in deep neural networks, thereby motivating its hardware implementation.
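The finite-nudging bias is easy to see in a scalar toy model: the classic one-sided estimate (free vs. nudged fixed point) carries an error that grows with the nudging strength beta, while a symmetric estimate contrasting +beta and -beta nudges cancels the leading-order term, which is the kind of bias reduction this line of work exploits. The energy, cost, and settling procedure below are illustrative choices.

```python
import numpy as np

rho = np.tanh
drho = lambda s: 1 - np.tanh(s) ** 2
x, y, w = 0.8, 0.5, 1.2

def settle(w, beta):
    """Gradient-descend the total energy F = E + beta*C to its minimum."""
    s = 0.0
    for _ in range(2000):
        dE = s - drho(s) * w * x                  # Hopfield-style energy term
        dC = (rho(s) - y) * drho(s)               # squared-error cost term
        s -= 0.01 * (dE + beta * dC)
    return s

def dE_dw(s):
    return -rho(s) * x

beta = 0.5                                        # deliberately large nudging
s0, sp, sm = settle(w, 0.0), settle(w, beta), settle(w, -beta)

one_sided = (dE_dw(sp) - dE_dw(s0)) / beta        # classic EP estimate
symmetric = (dE_dw(sp) - dE_dw(sm)) / (2 * beta)  # symmetric-nudging estimate

# Ground truth by finite differences through the free fixed point.
def loss(w):
    return 0.5 * (rho(settle(w, 0.0)) - y) ** 2
true = (loss(w + 1e-4) - loss(w - 1e-4)) / 2e-4

print(one_sided, symmetric, true)  # symmetric is typically much closer to true
```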
arXiv Detail & Related papers (2021-01-14T10:23:40Z)
- Complexity-based speciation and genotype representation for neuroevolution [81.21462458089142]
This paper introduces a speciation principle for neuroevolution where evolving networks are grouped into species based on the number of hidden neurons.
The proposed speciation principle is employed in several techniques designed to promote and preserve diversity within species and in the ecosystem as a whole.
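The grouping rule itself is one line: the species key is the hidden-neuron count. The fitness-sharing step in the sketch below is a common diversity-preserving mechanism in neuroevolution, included as an assumed example of the "several techniques" the summary mentions, not necessarily the paper's own.

```python
from collections import defaultdict

# Group evolving genomes into species by hidden-neuron count: a genome
# competes (and mates) only within its own complexity class.
population = [
    {"id": 0, "hidden": 3, "fitness": 0.7},
    {"id": 1, "hidden": 3, "fitness": 0.4},
    {"id": 2, "hidden": 5, "fitness": 0.9},
    {"id": 3, "hidden": 8, "fitness": 0.2},
]

species = defaultdict(list)
for genome in population:
    species[genome["hidden"]].append(genome)   # complexity = species key

# Fitness sharing: divide by species size so large species don't take over.
for members in species.values():
    for g in members:
        g["shared_fitness"] = g["fitness"] / len(members)
```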
arXiv Detail & Related papers (2020-10-11T06:26:56Z)
- A Theoretical Framework for Target Propagation [75.52598682467817]
We analyze target propagation (TP), a popular but not yet fully understood alternative to backpropagation (BP).
Our theory shows that TP is closely related to Gauss-Newton optimization and thus substantially differs from BP.
We provide a first solution to the problem of training the feedback weights through a novel reconstruction loss.
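To make the TP picture concrete: feedback weights are trained with a reconstruction loss so the backward map approximately inverts the layer, and targets are then propagated through that learned inverse (shown here in the standard difference-target-propagation form). The specific loss and update steps below are a generic sketch, not the paper's novel loss.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh
n0, n1 = 8, 8
W = rng.normal(0, 0.5, (n1, n0))     # forward weights
Q = rng.normal(0, 0.5, (n0, n1))     # learned feedback (approximate inverse)

def forward(h0):  return f(W @ h0)
def backward(h1): return f(Q @ h1)

# Train the feedback path with a reconstruction loss: backward(forward(h))
# should return h, so the feedback approximately inverts the layer.
for _ in range(500):
    h0 = rng.normal(size=n0)
    r = backward(forward(h0))
    e = r - h0                                   # reconstruction error
    Q -= 0.05 * np.outer((1 - r ** 2) * e, forward(h0))  # grad of ||e||^2 wrt Q

# Target propagation: nudge the top activity toward its goal, then map that
# target down through the learned inverse to get a layer-local target.
h0 = rng.normal(size=n0)
h1 = forward(h0)
t_out = np.zeros(n1)                   # toy task: drive the output to zero
t1 = h1 - 0.5 * (h1 - t_out)           # top target: step toward the goal
t0 = h0 + backward(t1) - backward(h1)  # difference target propagation;
                                       # in a deeper net, t0 feeds the layer below
pre = W @ h0                           # layer-local forward update toward t1
W -= 0.05 * np.outer((h1 - t1) * (1 - np.tanh(pre) ** 2), h0)
```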
arXiv Detail & Related papers (2020-06-25T12:07:06Z)
- Continual Weight Updates and Convolutional Architectures for Equilibrium Propagation [69.87491240509485]
Equilibrium Propagation (EP) is a biologically inspired alternative algorithm to backpropagation (BP) for training neural networks.
We propose a discrete-time formulation of EP which simplifies the equations, speeds up training, and extends EP to CNNs.
Our CNN model achieves the best performance ever reported on MNIST with EP.
arXiv Detail & Related papers (2020-04-29T12:14:06Z)