Sleep-Based Homeostatic Regularization for Stabilizing Spike-Timing-Dependent Plasticity in Recurrent Spiking Neural Networks
- URL: http://arxiv.org/abs/2601.08447v1
- Date: Tue, 13 Jan 2026 11:17:30 GMT
- Title: Sleep-Based Homeostatic Regularization for Stabilizing Spike-Timing-Dependent Plasticity in Recurrent Spiking Neural Networks
- Authors: Andreas Massey, Aliaksandr Hubin, Stefano Nichele, Solve Sæbø
- Abstract summary: Spike-timing-dependent plasticity (STDP) provides a biologically plausible learning mechanism for spiking neural networks (SNNs). We propose a neuromorphic regularization scheme inspired by the synaptic homeostasis hypothesis: periodic offline phases during which external inputs are suppressed, synaptic weights undergo stochastic decay toward a homeostatic baseline, and spontaneous activity enables memory consolidation.
- Score: 14.487258585834374
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Spike-timing-dependent plasticity (STDP) provides a biologically plausible learning mechanism for spiking neural networks (SNNs); however, Hebbian weight updates in architectures with recurrent connections suffer from pathological weight dynamics: unbounded growth, catastrophic forgetting, and loss of representational diversity. We propose a neuromorphic regularization scheme inspired by the synaptic homeostasis hypothesis: periodic offline phases during which external inputs are suppressed, synaptic weights undergo stochastic decay toward a homeostatic baseline, and spontaneous activity enables memory consolidation. We demonstrate that this sleep-wake cycle prevents weight saturation while preserving learned structure. Empirically, we find that low to intermediate sleep durations (10-20% of training) improve stability on MNIST-like benchmarks in our STDP-SNN model, without any data-specific hyperparameter tuning. In contrast, the same sleep intervention yields no measurable benefit for the surrogate-gradient spiking neural network (SG-SNN). Taken together, these results suggest that periodic, sleep-based renormalization may represent a fundamental mechanism for stabilizing local Hebbian learning in neuromorphic systems, while also indicating that special care is required when integrating such protocols with existing gradient-based optimization methods.
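The sleep phase described in the abstract reduces to an offline renormalization loop: inputs off, weights pulled stochastically toward a baseline. A minimal NumPy sketch, assuming a dense recurrent weight matrix and illustrative values for the decay rate, noise level, and baseline (none of these constants come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sleep_phase(W, w_baseline=0.5, decay=0.05, noise_std=0.01, steps=100):
    """Stochastically decay recurrent weights toward a homeostatic baseline.

    External input is assumed suppressed for the duration of this phase;
    only the offline weight renormalization is sketched here.
    """
    for _ in range(steps):
        # Deterministic pull toward the baseline plus small stochastic jitter.
        W += decay * (w_baseline - W) + noise_std * rng.normal(size=W.shape)
        np.clip(W, 0.0, 1.0, out=W)  # keep weights in a bounded range
    return W

W = rng.uniform(0.0, 1.0, size=(100, 100))  # recurrent weights after a wake phase
W = sleep_phase(W)
```

Interleaving such a phase periodically, at roughly 10-20% of total training time, matches the sleep durations the abstract reports as beneficial.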
Related papers
- Multi-Scale Temporal Homeostasis Enables Efficient and Robust Neural Networks [0.0]
Multi-Scale Temporal Homeostasis (MSTH) is a framework that integrates ultra-fast (5 ms), fast (2 s), medium (5 min), and slow (1 h) regulation into artificial networks. MSTH consistently improves accuracy, eliminates catastrophic failures, and enhances recovery from perturbations.
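A hedged sketch of what multi-timescale homeostasis could look like in code: several exponential averages of a unit's activity, each with its own time constant, jointly steering a firing threshold. The combination rule and all constants below are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative time constants (s) echoing the ultra-fast/fast/medium/slow scales.
TAUS = np.array([0.005, 2.0, 300.0, 3600.0])
DT = 0.001  # simulation step (s)

def homeostasis_step(traces, activity, threshold, target=0.1, gain=0.01):
    """One step of multi-timescale threshold regulation (illustrative)."""
    traces += DT / TAUS * (activity - traces)  # per-scale activity averages
    error = np.mean(traces) - target           # pooled deviation from target rate
    threshold += gain * error                  # homeostatic threshold adjustment
    return traces, threshold

traces, threshold = np.zeros(4), 1.0
for _ in range(1000):
    activity = float(rng.random() < 0.2)       # stand-in spike indicator
    traces, threshold = homeostasis_step(traces, activity, threshold)
```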
arXiv Detail & Related papers (2026-01-30T13:33:29Z)
- General Self-Prediction Enhancement for Spiking Neurons [71.01912385372577]
Spiking Neural Networks (SNNs) are highly energy-efficient due to event-driven, sparse computation, but their training is challenged by spike non-differentiability and trade-offs among performance, efficiency, and biological plausibility. We propose a self-prediction enhanced spiking neuron method that generates an internal prediction current from its input-output history to modulate membrane potential. This design offers dual advantages: it creates a continuous gradient path that alleviates vanishing gradients and boosts training stability and accuracy, while also aligning with biological principles, resembling distal dendritic modulation and error-driven synaptic plasticity.
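One plausible reading of the mechanism, sketched as an LIF neuron whose membrane potential receives an extra current predicted linearly from its recent input-output history. The fixed linear predictor and all constants are assumptions for illustration; the paper presumably learns this mapping.

```python
import numpy as np

def lif_with_self_prediction(inputs, tau=20.0, v_th=1.0, alpha=0.1):
    """LIF neuron with an internal prediction current from its own history."""
    v, history, spikes = 0.0, np.zeros(5), []
    w_pred = np.full(5, 0.1)                      # assumed fixed linear predictor
    for x in inputs:
        i_pred = alpha * float(w_pred @ history)  # self-prediction current
        v += (-v + x + i_pred) / tau              # leaky integration
        s = float(v >= v_th)
        if s:
            v = 0.0                               # hard reset on spike
        history = np.roll(history, 1)
        history[0] = x - s                        # crude input-output feature
        spikes.append(s)
    return spikes

rng = np.random.default_rng(2)
spikes = lif_with_self_prediction(2.0 * rng.random(200))
```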
arXiv Detail & Related papers (2026-01-29T15:08:48Z)
- ChronoPlastic Spiking Neural Networks [0.0]
Spiking neural networks (SNNs) offer a biologically grounded and energy-efficient alternative to conventional neural architectures. CPSNNs embed temporal control directly within local synaptic dynamics. CPSNNs learn long-gap temporal dependencies significantly faster and more reliably than standard SNN baselines.
arXiv Detail & Related papers (2025-12-17T06:58:04Z)
- Unleashing Temporal Capacity of Spiking Neural Networks through Spatiotemporal Separation [67.69345363409835]
Spiking Neural Networks (SNNs) are considered naturally suited for temporal processing, with membrane potential propagation widely regarded as the core temporal modeling mechanism. We design Non-Stateful (NS) models that progressively remove membrane propagation to isolate its stage-wise role. Experiments reveal a counterintuitive phenomenon: moderate removal in shallow layers improves performance, while excessive removal causes collapse.
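The Non-Stateful ablation can be pictured as an LIF layer whose membrane potential is optionally cleared at every timestep, severing temporal propagation. A minimal sketch under that assumption; the paper's architecture and training details are not reproduced.

```python
import numpy as np

def lif_layer(x_seq, stateful=True, tau=2.0, v_th=1.0):
    """Run an LIF layer over a [T, N] input sequence.

    With stateful=False, membrane potential is cleared before each step,
    removing temporal propagation (the Non-Stateful variant).
    """
    T, N = x_seq.shape
    v, out = np.zeros(N), np.zeros_like(x_seq)
    for t in range(T):
        if not stateful:
            v = np.zeros(N)              # sever temporal state
        v = v / tau + x_seq[t]           # leaky integration
        out[t] = (v >= v_th).astype(float)
        v = v * (1.0 - out[t])           # reset where a spike occurred
    return out

rng = np.random.default_rng(3)
x = rng.random((50, 8))
y_stateful, y_nonstateful = lif_layer(x, True), lif_layer(x, False)
```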
arXiv Detail & Related papers (2025-12-05T07:05:53Z)
- Neuronal Group Communication for Efficient Neural representation [85.36421257648294]
This paper addresses the question of how to build large neural systems that learn efficient, modular, and interpretable representations. We propose Neuronal Group Communication (NGC), a theory-driven framework that reimagines a neural network as a dynamical system of interacting neuronal groups. NGC treats weights as transient interactions between embedding-like neuronal states, with neural computation unfolding through iterative communication among groups of neurons.
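A loose sketch of "weights as transient interactions": each group carries an embedding-like state, and the effective coupling between groups is recomputed from those states at every communication round. The bilinear interaction below is an assumption chosen only to make the idea concrete.

```python
import numpy as np

rng = np.random.default_rng(4)
G, D = 6, 16                          # number of groups, embedding dimension
states = rng.normal(size=(G, D))      # per-group neuronal states
M = rng.normal(size=(D, D)) / D       # shared interaction matrix (assumed bilinear)

for _ in range(10):
    coupling = states @ M @ states.T           # [G, G] transient "weights"
    messages = coupling @ states               # groups exchange messages
    states = np.tanh(states + 0.1 * messages)  # iterative communication update
```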
arXiv Detail & Related papers (2025-10-19T14:23:35Z)
- Training Deep Normalization-Free Spiking Neural Networks with Lateral Inhibition [52.59263087086756]
Training deep spiking neural networks (SNNs) has critically depended on explicit normalization schemes, such as batch normalization. We propose a normalization-free learning framework that incorporates lateral inhibition inspired by cortical circuits. We show that our framework enables stable training of deep SNNs with biological realism and achieves competitive performance without resorting to explicit normalization.
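As a hedged illustration of how lateral inhibition can substitute for explicit normalization: each unit is inhibited by a fraction of its layer's pooled pre-activation, keeping activity centered without a normalization layer. The subtractive form and all constants are assumptions.

```python
import numpy as np

def laterally_inhibited_layer(x, W, beta=0.5, v_th=1.0):
    """Spiking layer with subtractive lateral inhibition in place of
    batch normalization: each unit is suppressed by the layer's mean drive."""
    pre = x @ W
    inhibited = pre - beta * pre.mean(axis=-1, keepdims=True)
    return (inhibited >= v_th).astype(float)

rng = np.random.default_rng(5)
x = rng.random((32, 64))                   # batch of inputs
W = rng.normal(scale=0.2, size=(64, 128))  # feedforward weights
spikes = laterally_inhibited_layer(x, W)
```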
arXiv Detail & Related papers (2025-09-27T11:11:30Z)
- Diverse Neural Sequences in QIF Networks: An Analytically Tractable Framework for Synfire Chains and Hippocampal Replay [0.0]
We propose a parsimonious network of Quadratic Integrate-and-Fire neurons with sequences embedded via a temporally asymmetric Hebbian (TAH) rule. Our findings demonstrate that this single framework robustly reproduces a spectrum of sequential activities, including persistent synfire-like chains and transient, hippocampal replay-like bursts exhibiting intra-ripple frequency accommodation (IFA). These results establish QIF networks with TAH connectivity as an analytically tractable and biologically plausible platform for investigating the emergence, stability, and diversity of sequential neural activity in the brain.
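Both ingredients are compact to write down: QIF membrane dynamics dv/dt = v^2 + I, and a Hebbian term that potentiates connections from earlier-firing to later-firing neurons via a presynaptic trace. The sketch below uses placeholder constants and a coarse Euler step, not the paper's analytical setup.

```python
import numpy as np

rng = np.random.default_rng(6)
N, DT = 50, 0.01
v = rng.uniform(-1.0, 0.0, N)       # QIF membrane potentials
W = np.zeros((N, N))                # recurrent weights, W[post, pre]
trace = np.zeros(N)                 # presynaptic trace for the asymmetric rule
spiked = np.zeros(N)

for _ in range(2000):
    I = 2.0 + W @ spiked            # constant drive plus recurrent input
    v += DT * (v**2 + I)            # QIF dynamics: dv/dt = v^2 + I
    spiked = (v > 10.0).astype(float)
    v[spiked > 0] = -10.0           # reset after spike
    # Temporally asymmetric Hebbian rule: pre-before-post potentiates.
    W += 0.01 * np.outer(spiked, trace)
    trace = trace * np.exp(-DT / 0.05) + spiked
```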
arXiv Detail & Related papers (2025-08-08T07:27:47Z)
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors, such as inertia, damping, a learned potential function, and forces, to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
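The underdamped Langevin equation driving the latents integrates with a plain Euler-Maruyama step. The quadratic potential below is a stand-in for the learned one, and all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def langevin_step(z, p, dt=0.01, gamma=0.5, temp=0.1):
    """One Euler-Maruyama step of underdamped Langevin dynamics:
    dz = p dt;  dp = (-grad U(z) - gamma p) dt + sqrt(2 gamma T) dW.
    U(z) = |z|^2 / 2 stands in for the learned potential."""
    grad_u = z                                   # gradient of the toy potential
    z = z + dt * p
    p = (p + dt * (-grad_u - gamma * p)
         + np.sqrt(2.0 * gamma * temp * dt) * rng.normal(size=p.shape))
    return z, p

z, p = rng.normal(size=8), np.zeros(8)           # latent position and momentum
for _ in range(1000):
    z, p = langevin_step(z, p)
```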
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- Extending Spike-Timing Dependent Plasticity to Learning Synaptic Delays [50.45313162890861]
We introduce a novel learning rule for simultaneously learning synaptic connection strengths and delays. We validate our approach by extending a widely used SNN model for classification trained with unsupervised learning. Results demonstrate that our proposed method consistently achieves superior performance across a variety of test scenarios.
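A hedged sketch of joint weight-and-delay learning: pair-based STDP updates the weight from the delay-adjusted timing difference, while the delay is nudged so the presynaptic spike's arrival moves toward the postsynaptic spike. The delay rule is illustrative, not the paper's exact formulation.

```python
import numpy as np

def stdp_weight_delay_update(w, d, t_pre, t_post, a_plus=0.01,
                             a_minus=0.012, tau=20.0, eta_d=0.1):
    """Update one synapse's weight w and delay d (ms) from a spike pair.

    dt > 0 means the delayed pre-spike arrived before the post-spike."""
    dt = t_post - (t_pre + d)             # arrival-relative timing difference
    if dt > 0:
        w += a_plus * np.exp(-dt / tau)   # causal pairing: potentiate
    else:
        w -= a_minus * np.exp(dt / tau)   # anti-causal pairing: depress
    d += eta_d * np.sign(dt)              # shift spike arrival toward t_post
    return w, max(d, 0.0)                 # delays stay non-negative

w, d = stdp_weight_delay_update(w=0.5, d=3.0, t_pre=10.0, t_post=15.0)
```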
arXiv Detail & Related papers (2025-06-17T21:24:58Z)
- Learning with Spike Synchrony in Spiking Neural Networks [3.8506283985103447]
Spiking neural networks (SNNs) promise energy-efficient computation by mimicking biological neural dynamics. We introduce spike-synchrony-dependent plasticity (SSDP), a training approach that adjusts synaptic weights based on the degree of synchronous neural firing rather than precise spike timing.
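A minimal sketch of a synchrony-driven rule: weights between units that fire within the same short window are potentiated in proportion to how much of the population fired with them, with a slow decay for stability. Window size, rates, and the exact potentiation form are assumptions.

```python
import numpy as np

def ssdp_update(W, spikes, eta=0.005, decay=0.001):
    """Spike-synchrony-dependent update for one time window.

    `spikes` marks which neurons fired in the window; co-active pairs are
    strengthened in proportion to the overall population synchrony."""
    sync = spikes.mean()                         # fraction of co-active neurons
    W += eta * sync * np.outer(spikes, spikes)   # strengthen co-active pairs
    W -= decay * W                               # slow decay for stability
    np.fill_diagonal(W, 0.0)                     # no self-connections
    return W

rng = np.random.default_rng(8)
W = np.zeros((30, 30))
for _ in range(100):
    W = ssdp_update(W, (rng.random(30) < 0.2).astype(float))
```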
arXiv Detail & Related papers (2025-04-14T04:01:40Z)
- Unconditional stability of a recurrent neural circuit implementing divisive normalization [0.0]
We prove the remarkable property of unconditional local stability for an arbitrary-dimensional ORGaNICs circuit. We show that ORGaNICs can be trained by backpropagation through time without gradient clipping/scaling.
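The steady-state computation the circuit implements is standard divisive normalization: each unit's squared drive is divided by a pooled measure of population activity. A sketch with hypothetical constants; the paper's contribution is the recurrent circuit and its stability proof, not this static formula.

```python
import numpy as np

def divisive_normalization(z, sigma=1.0, gamma=1.0):
    """Each unit's squared drive divided by pooled population activity."""
    return gamma * z**2 / (sigma**2 + np.sum(z**2))

y = divisive_normalization(np.array([0.5, 1.0, 2.0]))
```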
arXiv Detail & Related papers (2024-09-27T17:46:05Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore the parallels between machine learning dynamics and physical systems in and out of equilibrium.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
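A compact sketch of SGLD with without-replacement minibatching: the data are permuted once per epoch and consumed in disjoint chunks, with Gaussian noise injected into every gradient step. The toy linear-regression loss stands in for a real model.

```python
import numpy as np

rng = np.random.default_rng(9)
X = rng.normal(size=(1000, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=1000)

theta, lr, batch = np.zeros(10), 1e-3, 50
for epoch in range(20):
    perm = rng.permutation(len(X))       # without-replacement ordering
    for i in range(0, len(X), batch):
        idx = perm[i:i + batch]          # disjoint minibatch
        grad = X[idx].T @ (X[idx] @ theta - y[idx]) / batch
        # SGLD step: gradient descent plus injected Gaussian noise.
        theta += -lr * grad + np.sqrt(2.0 * lr) * rng.normal(size=theta.shape)
```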
arXiv Detail & Related papers (2023-06-06T09:12:49Z)