Learning with Spike Synchrony in Spiking Neural Networks
- URL: http://arxiv.org/abs/2505.14841v2
- Date: Mon, 25 Aug 2025 07:25:09 GMT
- Title: Learning with Spike Synchrony in Spiking Neural Networks
- Authors: Yuchen Tian, Assel Kembay, Samuel Tensingh, Nhan Duy Truong, Jason K. Eshraghian, Omid Kavehei
- Abstract summary: Spiking neural networks (SNNs) promise energy-efficient computation by mimicking biological neural dynamics. We introduce spike-synchrony-dependent plasticity (SSDP), a training approach that adjusts synaptic weights based on the degree of synchronous neural firing rather than spike timing order.
- Score: 3.8506283985103447
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNNs) promise energy-efficient computation by mimicking biological neural dynamics, yet existing plasticity rules focus on isolated spike pairs and fail to leverage the synchronous activity patterns that drive learning in biological systems. We introduce spike-synchrony-dependent plasticity (SSDP), a training approach that adjusts synaptic weights based on the degree of synchronous neural firing rather than spike timing order. Our method operates as a local, post-optimization mechanism that applies updates to sparse parameter subsets, maintaining computational efficiency with linear scaling. SSDP serves as a lightweight event-structure regularizer, biasing the network toward biologically plausible spatio-temporal synchrony while preserving standard convergence behavior. SSDP seamlessly integrates with standard backpropagation while preserving the forward computation graph. We validate our approach across single-layer SNNs and spiking Transformers on datasets from static images to high-temporal-resolution tasks, demonstrating improved convergence stability and enhanced robustness to spike-time jitter and event noise. These findings provide new insights into how biological neural networks might leverage synchronous activity for efficient information processing and suggest that synchrony-dependent plasticity represents a key computational principle underlying neural learning.
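To make the idea concrete, below is a minimal NumPy sketch of a synchrony-gated Hebbian update applied to a sparse subset of weights, assuming a simple co-activation synchrony measure over a spike-raster window. All names and constants are illustrative; this is not the paper's actual SSDP rule.

```python
import numpy as np

def ssdp_update(weights, pre_spikes, post_spikes, lr=1e-3, sparsity=0.1):
    """One SSDP-style step: strengthen synapses whose pre- and post-neurons
    fired within the same window, scaled by how synchronous the population
    was overall. (Illustrative sketch, not the paper's exact rule.)

    pre_spikes:  (T, n_pre)  binary spike raster for one window
    post_spikes: (T, n_post) binary spike raster for one window
    """
    # Population synchrony: co-active fraction per time step, averaged over
    # the window (one simple choice of synchrony measure).
    sync = float(np.mean(pre_spikes.mean(1) * post_spikes.mean(1)))

    # Hebbian-style co-activation term, weighted by the synchrony factor.
    coact = pre_spikes.T @ post_spikes / len(pre_spikes)  # (n_pre, n_post)
    delta = lr * sync * coact

    # Apply updates only to the largest-magnitude entries, mirroring the
    # abstract's "updates to sparse parameter subsets".
    k = max(1, int(sparsity * delta.size))
    thresh = np.partition(np.abs(delta).ravel(), -k)[-k]
    mask = np.abs(delta) >= thresh
    return weights + delta * mask

rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, size=(64, 32))
pre = (rng.random((100, 64)) < 0.05).astype(float)
post = (rng.random((100, 32)) < 0.05).astype(float)
W = ssdp_update(W, pre, post)
```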
Related papers
- ChronoPlastic Spiking Neural Networks [0.0]
Spiking neural networks (SNNs) offer a biologically grounded and energy-efficient alternative to conventional neural architectures.
CPSNNs embed temporal control directly within local synaptic dynamics.
CPSNNs learn long-gap temporal dependencies significantly faster and more reliably than standard SNN baselines.
arXiv Detail & Related papers (2025-12-17T06:58:04Z) - Neuronal Group Communication for Efficient Neural representation [85.36421257648294]
This paper addresses the question of how to build large neural systems that learn efficient, modular, and interpretable representations.
We propose Neuronal Group Communication (NGC), a theory-driven framework that reimagines a neural network as a dynamical system of interacting neuronal groups.
NGC treats weights as transient interactions between embedding-like neuronal states, with neural computation unfolding through iterative communication among groups of neurons.
arXiv Detail & Related papers (2025-10-19T14:23:35Z) - Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation.
Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics.
We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
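For intuition, here is a minimal sketch of fractional-order LIF dynamics integrated with the explicit Grünwald-Letnikov scheme; the weighted sum over all past voltages is what breaks the Markov assumption (alpha = 1 recovers the ordinary LIF). This is a generic fractional LIF, not fspikeDE's adjoint-trained formulation.

```python
import numpy as np

def fractional_lif(I, alpha=0.8, h=1.0, tau=10.0, v_th=1.0, v_reset=0.0):
    """Explicit Grunwald-Letnikov integration of a fractional-order LIF
    neuron, D^alpha v = (-v + I) / tau, with threshold/reset spiking."""
    T = len(I)
    # GL binomial weights: w_0 = 1, w_j = w_{j-1} * (1 - (alpha + 1) / j)
    w = np.ones(T)
    for j in range(1, T):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)

    v = np.zeros(T)
    spikes = np.zeros(T)
    for n in range(1, T):
        # Memory term over the full voltage history: sum_j w_j * v_{n-j}
        memory = np.dot(w[1:n + 1], v[n - 1::-1])
        v[n] = h**alpha * (-v[n - 1] + I[n]) / tau - memory
        if v[n] >= v_th:
            spikes[n] = 1.0
            v[n] = v_reset
    return v, spikes

v, s = fractional_lif(np.full(200, 1.5), alpha=0.8)
```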
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation.
Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems.
Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
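The underdamped Langevin dynamics can be simulated with a standard Euler-Maruyama step. Below is a minimal sketch with a fixed quadratic potential standing in for the learned one; in LangevinFlow the potential and force are learned networks.

```python
import numpy as np

def langevin_step(x, v, grad_U, gamma=0.5, sigma=0.1, dt=0.01, force=0.0, rng=None):
    """One Euler-Maruyama step of the underdamped Langevin equation:
        dx = v dt
        dv = (-grad U(x) - gamma * v + force) dt + sigma dW
    gamma is the damping, sigma the noise scale."""
    rng = rng or np.random.default_rng()
    noise = sigma * np.sqrt(dt) * rng.standard_normal(np.shape(v))
    v_new = v + dt * (-grad_U(x) - gamma * v + force) + noise
    x_new = x + dt * v_new  # semi-implicit: use the updated velocity
    return x_new, v_new

# Example: quadratic potential U(x) = 0.5 * x^2, so grad U(x) = x.
x, v = np.ones(3), np.zeros(3)
for _ in range(1000):
    x, v = langevin_step(x, v, grad_U=lambda x: x)
```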
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Neuromorphic Wireless Split Computing with Resonate-and-Fire Neurons [69.73249913506042]
This paper investigates a wireless split computing architecture that employs resonate-and-fire (RF) neurons to process time-domain signals directly.
By resonating at tunable frequencies, RF neurons extract time-localized spectral features while maintaining low spiking activity.
Experimental results show that the proposed RF-SNN architecture achieves comparable accuracy to conventional LIF-SNNs and ANNs.
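A resonate-and-fire neuron can be sketched with Izhikevich's complex-valued formulation, where a rotation frequency sets the band the neuron is tuned to; the paper's RF-SNN layer details may differ.

```python
import numpy as np

def resonate_and_fire(I, omega=0.3, b=-0.05, dt=1.0, z_th=1.0):
    """Izhikevich-style resonate-and-fire neuron: the complex state z
    obeys dz/dt = (b + i*omega) z + I and the neuron spikes when the
    imaginary part crosses threshold. omega sets the resonant frequency."""
    z = 0.0 + 0.0j
    spikes = np.zeros(len(I))
    for t, i_t in enumerate(I):
        z += dt * ((b + 1j * omega) * z + i_t)
        if z.imag >= z_th:
            spikes[t] = 1.0
            z = 0.0 + 0.0j  # reset after spike
    return spikes

# The neuron responds preferentially to inputs near its resonant frequency.
t = np.arange(500)
spk = resonate_and_fire(0.2 * np.sin(0.3 * t))
```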
arXiv Detail & Related papers (2025-06-24T21:14:59Z) - Extending Spike-Timing Dependent Plasticity to Learning Synaptic Delays [50.45313162890861]
We introduce a novel learning rule for simultaneously learning synaptic connection strengths and delays.
We validate our approach by extending a widely-used SNN model for classification trained with unsupervised learning.
Results demonstrate that our proposed method consistently achieves superior performance across a variety of test scenarios.
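As a hedged illustration of joint weight-and-delay learning (not the paper's exact rule), one can pair exponential STDP on the delayed arrival time with a delay nudge that aligns arrival with the postsynaptic spike:

```python
import numpy as np

def stdp_weight_delay_update(w, d, t_pre, t_post, lr_w=0.01, lr_d=0.1,
                             tau=20.0, w_max=1.0, d_max=25.0):
    """Jointly update one synapse's weight and delay from a pre/post spike
    pair. The pre spike effectively arrives at t_pre + d. The weight
    follows classic exponential STDP on the arrival-to-post lag; the delay
    is nudged so arrival aligns with the post spike."""
    lag = t_post - (t_pre + d)  # > 0: arrival precedes the post spike
    if lag >= 0:   # causal: potentiate
        w = min(w_max, w + lr_w * np.exp(-lag / tau))
    else:          # anti-causal: depress
        w = max(0.0, w - lr_w * np.exp(lag / tau))
    d = float(np.clip(d + lr_d * lag, 0.0, d_max))  # align arrival with post
    return w, d

w, d = stdp_weight_delay_update(w=0.5, d=5.0, t_pre=10.0, t_post=18.0)
```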
arXiv Detail & Related papers (2025-06-17T21:24:58Z) - Neural Models of Task Adaptation: A Tutorial on Spiking Networks for Executive Control [0.0]
This tutorial presents a step-by-step approach to constructing a spiking neural network (SNN) that simulates task-switching dynamics.
The model incorporates biologically realistic features, including lateral inhibition, adaptive synaptic weights, and precise parameterization within physiologically relevant ranges.
By following this tutorial, researchers can develop and extend biologically inspired SNN models for studying cognitive processes and neural adaptation.
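The lateral-inhibition ingredient, for example, can be sketched as a soft winner-take-all among competing populations; the parameters here are arbitrary, not the tutorial's.

```python
import numpy as np

def lateral_inhibition(drive, w_inh=0.5):
    """Soft winner-take-all via lateral inhibition: each unit's drive is
    reduced by the summed activity of its competitors."""
    total = drive.sum()
    inhibited = drive - w_inh * (total - drive)  # subtract others' activity
    return np.maximum(inhibited, 0.0)

pools = np.array([1.2, 0.9, 0.3])   # e.g., competing task-rule populations
print(lateral_inhibition(pools))    # the strongest pool suppresses the rest
```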
arXiv Detail & Related papers (2025-03-05T00:44:34Z) - TESS: A Scalable Temporally and Spatially Local Learning Rule for Spiking Neural Networks [6.805933498669221]
Training spiking neural networks (SNNs) on resource-constrained devices remains challenging due to high computational and memory demands.
We introduce TESS, a temporally and spatially local learning rule for training SNNs.
Our approach addresses both temporal and spatial credit assignment by relying solely on locally available signals within each neuron.
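A generic sketch of this rule family, an eligibility trace for temporal credit gated by a locally available error factor for spatial credit, is shown below; this is not TESS's specific formulation.

```python
import numpy as np

def local_three_factor_update(w, trace, pre, post_err, lr=1e-3, decay=0.9):
    """Generic temporally/spatially local learning step: an eligibility
    trace low-passes presynaptic activity (temporal credit), and the
    update is gated by an error signal available at the postsynaptic
    neuron (spatial credit). No global backward pass is required."""
    trace = decay * trace + pre
    w = w - lr * np.outer(trace, post_err)
    return w, trace

w, trace = np.zeros((10, 2)), np.zeros(10)
w, trace = local_three_factor_update(w, trace,
                                     pre=np.random.rand(10),
                                     post_err=np.array([0.3, -0.1]))
```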
arXiv Detail & Related papers (2025-02-03T21:23:15Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
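Since CSDP is forward-forward-based, a minimal sketch of a Forward-Forward layer update (Hinton's goodness-based local rule, without CSDP's spiking dynamics) may help fix ideas:

```python
import numpy as np

def goodness(h):
    """Forward-Forward 'goodness' of a layer's activity: the sum of squared
    activations. Positive data should score above a threshold, negative
    data below it; no backward pass through the network is needed."""
    return float(np.sum(h ** 2))

def ff_layer_update(W, x_pos, x_neg, theta=2.0, lr=0.03):
    """One local Forward-Forward step for a single ReLU layer. The logistic
    loss on (goodness - theta) is differentiated w.r.t. this layer's
    weights only. (Sketch of the FF family CSDP builds on; CSDP itself
    adds spiking, contrastive-signal-dependent dynamics.)"""
    for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
        h = np.maximum(W @ x, 0.0)
        p = 1.0 / (1.0 + np.exp(-sign * (goodness(h) - theta)))
        # d(loss)/dW through goodness = sum h^2, with the ReLU mask (h > 0)
        grad = -sign * (1.0 - p) * np.outer(2.0 * h * (h > 0), x)
        W -= lr * grad
    return W

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((16, 8))
W = ff_layer_update(W, x_pos=rng.random(8), x_neg=rng.random(8))
```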
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Advancing Spatio-Temporal Processing in Spiking Neural Networks through Adaptation [6.233189707488025]
Spiking neural networks on neuromorphic hardware promise orders of magnitude less power consumption than their non-spiking counterparts.
The standard neuron model for spike-based computation on such systems has long been the leaky integrate-and-fire (LIF) neuron.
The root of the superior performance of so-called adaptive LIF neurons is not well understood.
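A common adaptive LIF (ALIF) formulation adds a spike-triggered variable that raises the effective threshold, producing spike-frequency adaptation; a minimal sketch follows (not necessarily the paper's variant).

```python
import numpy as np

def adaptive_lif(I, beta=0.9, rho=0.98, b0=1.0, kappa=0.2):
    """Adaptive LIF neuron: besides the leaky membrane potential, a slow
    adaptation variable raises the firing threshold after every spike,
    giving spike-frequency adaptation."""
    u, a = 0.0, 0.0
    spikes = np.zeros(len(I))
    for t, i_t in enumerate(I):
        u = beta * u + i_t          # leaky integration of input current
        theta = b0 + kappa * a      # effective, activity-dependent threshold
        if u >= theta:
            spikes[t] = 1.0
            u -= theta              # soft reset
            a = rho * a + 1.0       # bump adaptation on spike
        else:
            a = rho * a             # adaptation decays between spikes
    return spikes

spk = adaptive_lif(np.full(300, 1.2))  # firing rate decreases over time
```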
arXiv Detail & Related papers (2024-08-14T12:49:58Z) - Learning Delays Through Gradients and Structure: Emergence of Spatiotemporal Patterns in Spiking Neural Networks [0.06752396542927405]
We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches.
In the latter approach, the network selects and prunes connections, optimizing the delays in sparse connectivity settings.
Our results demonstrate the potential of combining delay learning with dynamic pruning to develop efficient SNN models for temporal data processing.
arXiv Detail & Related papers (2024-07-07T11:55:48Z) - Synchronized Stepwise Control of Firing and Learning Thresholds in a Spiking Randomly Connected Neural Network toward Hardware Implementation [0.0]
We propose hardware-oriented models of intrinsic plasticity (IP) and synaptic plasticity (SP) for a spiking randomly connected neural network (RNN).
We demonstrate the effectiveness of our model through simulations of temporal data learning and anomaly detection with a spiking RNN using publicly available electrocardiograms.
arXiv Detail & Related papers (2024-04-26T08:26:10Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low-conductance regime, which enables stable and energy-efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - Developmental Plasticity-inspired Adaptive Pruning for Deep Spiking and Artificial Neural Networks [11.730984231143108]
Developmental plasticity plays a prominent role in shaping the brain's structure during ongoing learning.
Existing network compression methods for deep artificial neural networks (ANNs) and spiking neural networks (SNNs) draw little inspiration from the brain's developmental plasticity mechanisms.
This paper proposes a developmental plasticity-inspired adaptive pruning (DPAP) method, with inspiration from the adaptive developmental pruning of dendritic spines, synapses, and neurons.
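The pruning idea can be sketched as activity-weighted magnitude pruning; the scoring below is illustrative, not DPAP's trace-based measure.

```python
import numpy as np

def dpap_style_prune(W, pre_activity, post_activity, keep=0.8):
    """Prune the synapses whose endpoints are least active, loosely
    mirroring developmental pruning of weak dendritic spines and synapses.
    An importance score combines weight magnitude with pre/post activity;
    the weakest (1 - keep) fraction of entries is zeroed."""
    score = np.abs(W) * np.outer(pre_activity, post_activity)
    k = int(keep * W.size)
    thresh = np.partition(score.ravel(), -k)[-k]
    return np.where(score >= thresh, W, 0.0)

rng = np.random.default_rng(1)
W = rng.normal(size=(8, 4))
W = dpap_style_prune(W, rng.random(8), rng.random(4))
```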
arXiv Detail & Related papers (2022-11-23T05:26:51Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
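For reference, minimal snnTorch usage looks like the following (standard snnTorch API; the IPU-specific acceleration lives in the optimized release and is not shown here).

```python
import torch
import snntorch as snn

# A leaky integrate-and-fire layer unrolled over time.
lif = snn.Leaky(beta=0.9)          # membrane decay per step
fc = torch.nn.Linear(784, 100)
mem = lif.init_leaky()

x = torch.rand(25, 1, 784)         # (time steps, batch, features)
spk_rec = []
for step in range(x.shape[0]):
    cur = fc(x[step])
    spk, mem = lif(cur, mem)       # returns spikes and updated membrane
    spk_rec.append(spk)
spikes = torch.stack(spk_rec)      # (25, 1, 100)
```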
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Modeling Associative Plasticity between Synapses to Enhance Learning of Spiking Neural Networks [4.736525128377909]
Spiking Neural Networks (SNNs) are the third generation of artificial neural networks that enable energy-efficient implementation on neuromorphic hardware.
We propose a robust and effective learning mechanism by modeling the associative plasticity between synapses.
Our approaches achieve superior performance on static and state-of-the-art neuromorphic datasets.
arXiv Detail & Related papers (2022-07-24T06:12:23Z) - Increasing Liquid State Machine Performance with Edge-of-Chaos Dynamics Organized by Astrocyte-modulated Plasticity [0.0]
The liquid state machine (LSM) tunes internal weights without backpropagation of gradients.
Recent findings suggest that astrocytes, a long-neglected non-neuronal brain cell, modulate synaptic plasticity and brain dynamics.
We propose the neuron-astrocyte liquid state machine (NALSM) that addresses under-performance through self-organized near-critical dynamics.
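A generic LSM sketch (fixed random spiking reservoir, trained linear readout) is shown below; NALSM's astrocyte mechanism, which tunes the reservoir toward near-critical dynamics, is not modeled.

```python
import numpy as np

def lsm_readout_features(inputs, n_res=100, seed=0):
    """Minimal liquid-state-machine sketch: a fixed random spiking
    reservoir projects the input stream into a high-dimensional state;
    only a linear readout on the filtered reservoir activity is trained."""
    rng = np.random.default_rng(seed)
    W_in = rng.normal(0, 1.0, (n_res, inputs.shape[1]))
    W_res = rng.normal(0, 1.0 / np.sqrt(n_res), (n_res, n_res))
    v = np.zeros(n_res)
    spk = np.zeros(n_res)
    state = np.zeros(n_res)            # low-pass filtered spike trace
    for x in inputs:                   # LIF reservoir with fixed weights
        v = 0.9 * v + W_in @ x + W_res @ spk
        spk = (v > 1.0).astype(float)
        v = np.where(spk > 0, 0.0, v)  # reset on spike
        state = 0.95 * state + spk
    return state                       # features for a trained linear readout

feats = lsm_readout_features(np.random.rand(50, 3))
```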
arXiv Detail & Related papers (2021-10-26T23:04:40Z) - SpikePropamine: Differentiable Plasticity in Spiking Neural Networks [0.0]
We introduce a framework for learning the dynamics of synaptic plasticity and neuromodulated synaptic plasticity in Spiking Neural Networks (SNNs).
We show that SNNs augmented with differentiable plasticity are sufficient for solving a set of challenging temporal learning tasks.
These networks are also shown to be capable of producing locomotion on a high-dimensional robotic learning task.
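A minimal non-spiking sketch of the differentiable-plasticity family (a fixed weight plus a learned plasticity coefficient scaling a Hebbian trace) follows; SpikePropamine extends this with spiking dynamics and neuromodulation.

```python
import torch

class PlasticLinear(torch.nn.Module):
    """Differentiable plasticity: each connection has a fixed weight w plus
    a Hebbian trace scaled by a learned plasticity coefficient alpha.
    w and alpha are trained by gradient descent; the trace evolves during
    the episode."""
    def __init__(self, n_in, n_out, eta=0.1):
        super().__init__()
        self.w = torch.nn.Parameter(0.1 * torch.randn(n_in, n_out))
        self.alpha = torch.nn.Parameter(0.1 * torch.randn(n_in, n_out))
        self.eta = eta

    def forward(self, x, hebb):
        y = torch.tanh(x @ (self.w + self.alpha * hebb))
        # Hebbian trace update from the current pre/post activity.
        hebb = (1 - self.eta) * hebb + self.eta * torch.einsum(
            "bi,bo->io", x, y) / x.shape[0]
        return y, hebb

layer = PlasticLinear(4, 3)
hebb = torch.zeros(4, 3)
y, hebb = layer(torch.randn(2, 4), hebb)
```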
arXiv Detail & Related papers (2021-06-04T19:29:07Z) - Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
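Threshold homeostasis can be sketched as a rate-tracking update; this generic form omits MPATH's membrane-potential component.

```python
def homeostatic_threshold(theta, rate_est, spike, target=0.05,
                          tau_rate=0.99, gain=0.01):
    """Activation-threshold homeostasis: track an exponential estimate of
    the neuron's firing rate and move the threshold to pull that rate
    toward a target, so activity self-regulates under changing input."""
    rate_est = tau_rate * rate_est + (1 - tau_rate) * spike
    theta += gain * (rate_est - target)  # fires too much -> raise threshold
    return theta, rate_est

theta, r = 1.0, 0.0
theta, r = homeostatic_threshold(theta, r, spike=1.0)
```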
arXiv Detail & Related papers (2021-04-22T04:01:32Z) - Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
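The strategy suggests a joint objective; below is a sketch assuming a model that returns both outputs and hidden states (an interface assumption, not the paper's actual code).

```python
import torch

def output_and_dynamics_loss(model, x, y_target, h_target, idx, lam=0.5):
    """Fit both an RNN's input-output behavior and its internal dynamics:
    the usual output loss plus a penalty matching hidden states to the
    recorded activity of a sparse subset of units (indices idx). lam
    balances the two terms."""
    y_pred, h = model(x)  # assumed: model returns outputs and hidden states
    out_loss = torch.nn.functional.mse_loss(y_pred, y_target)
    dyn_loss = torch.nn.functional.mse_loss(h[..., idx], h_target)
    return out_loss + lam * dyn_loss
```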
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.