Sparse Axonal and Dendritic Delays Enable Competitive SNNs for Keyword Classification
- URL: http://arxiv.org/abs/2602.09746v1
- Date: Tue, 10 Feb 2026 12:57:02 GMT
- Title: Sparse Axonal and Dendritic Delays Enable Competitive SNNs for Keyword Classification
- Authors: Younes Bouhadjar, Emre Neftci
- Abstract summary: Training transmission delays in spiking neural networks (SNNs) has been shown to substantially improve their performance on complex temporal tasks. We show that learning either axonal or dendritic delays enables deep feedforward SNNs to reach accuracy comparable to existing synaptic delay learning approaches.
- Score: 5.928605435529651
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Training transmission delays in spiking neural networks (SNNs) has been shown to substantially improve their performance on complex temporal tasks. In this work, we show that learning either axonal or dendritic delays enables deep feedforward SNNs composed of leaky integrate-and-fire (LIF) neurons to reach accuracy comparable to existing synaptic delay learning approaches, while significantly reducing memory and computational overhead. SNN models with either axonal or dendritic delays achieve up to $95.58\%$ on the Google Speech Command (GSC) and $80.97\%$ on the Spiking Speech Command (SSC) datasets, matching or exceeding prior methods based on synaptic delays or more complex neuron models. By adjusting the delay parameters, we obtain improved performance for synaptic delay learning baselines, strengthening the comparison. We find that axonal delays offer the most favorable trade-off, combining lower buffering requirements with slightly higher accuracy than dendritic delays. We further show that the performance of axonal and dendritic delay models is largely preserved under strong delay sparsity, with as few as $20\%$ of delays remaining active, further reducing buffering requirements. Overall, our results indicate that learnable axonal and dendritic delays provide a resource-efficient and effective mechanism for temporal representation in SNNs. Code is available at https://github.com/YounesBouhadjar/AxDenSynDelaySNN
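The abstract contrasts axonal delays (one delay per output neuron, so one buffer per neuron) with synaptic delays (one per connection). A minimal NumPy sketch of how a feedforward LIF layer with per-neuron axonal delays might be simulated; the function name, parameter values, and the hard-reset choice are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def lif_axonal_delays(spikes_in, weights, delays, tau=0.9, v_th=1.0):
    """One feedforward layer of LIF neurons with per-neuron axonal delays.

    spikes_in : (T, n_in) binary input spike trains
    weights   : (n_out, n_in) synaptic weights
    delays    : (n_out,) integer axonal delay per OUTPUT neuron; only n_out
                delay buffers are needed, versus n_out * n_in for synaptic delays
    """
    T, _ = spikes_in.shape
    n_out = weights.shape[0]
    max_d = int(delays.max())
    v = np.zeros(n_out)                        # membrane potentials
    out = np.zeros((T + max_d, n_out))         # output buffer holding delayed spikes
    for t in range(T):
        v = tau * v + weights @ spikes_in[t]   # leaky integration of input current
        fired = np.flatnonzero(v >= v_th)
        v[fired] = 0.0                         # hard reset after a spike
        out[t + delays[fired], fired] = 1.0    # spike surfaces after its axonal delay
    return out[:T]
```

Because the delay is applied once to a neuron's outgoing spike train rather than per synapse, the buffering cost scales with the number of neurons, which is the trade-off the abstract highlights.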
Related papers
- Delays in Spiking Neural Networks: A State Space Model Approach [2.309307613420651]
Spiking neural networks (SNNs) are biologically inspired, event-driven models suitable for processing temporal data. We propose a general framework for incorporating delays into SNNs through additional state variables. We show that the proposed mechanism matches the performance of existing delay-based SNNs while remaining computationally efficient.
arXiv Detail & Related papers (2025-12-01T17:26:21Z) - DelRec: learning delays in recurrent spiking neural networks [44.373421535679476]
Spiking neural networks (SNNs) are a bio-inspired alternative to conventional real-valued deep learning models. DelRec is the first SGL-based method to train axonal or synaptic delays in recurrent spiking layers. Our results demonstrate that recurrent delays are critical for temporal processing in SNNs.
arXiv Detail & Related papers (2025-09-29T14:38:57Z) - Extending Spike-Timing Dependent Plasticity to Learning Synaptic Delays [50.45313162890861]
We introduce a novel learning rule for simultaneously learning synaptic connection strengths and delays. We validate our approach by extending a widely-used SNN model for classification trained with unsupervised learning. Results demonstrate that our proposed method consistently achieves superior performance across a variety of test scenarios.
arXiv Detail & Related papers (2025-06-17T21:24:58Z) - Efficient Event-based Delay Learning in Spiking Neural Networks [0.1350479308585481]
Spiking Neural Networks (SNNs) compute using sparse communication and are attracting increased attention. We propose a novel event-based training method for SNNs with delays, grounded in the EventProp formalism. Our method supports multiple spikes per neuron and, to the best of our knowledge, is the first delay learning algorithm to be applied to recurrent SNNs.
arXiv Detail & Related papers (2025-01-13T13:44:34Z) - Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z) - Stochastic Approximation with Delayed Updates: Finite-Time Rates under Markovian Sampling [73.5602474095954]
We study the non-asymptotic performance of approximation schemes with delayed updates under Markovian sampling.
Our theoretical findings shed light on the finite-time effects of delays for a broad class of algorithms.
arXiv Detail & Related papers (2024-02-19T03:08:02Z) - Learning Delays in Spiking Neural Networks using Dilated Convolutions with Learnable Spacings [1.534667887016089]
Spiking Neural Networks (SNNs) are a promising research direction for building power-efficient information processing systems.
In SNNs, delays refer to the time needed for one spike to travel from one neuron to another.
It has been shown theoretically that plastic delays greatly increase the expressivity in SNNs.
We propose a new discrete-time algorithm that addresses this issue in deep feedforward SNNs using backpropagation.
arXiv Detail & Related papers (2023-06-30T14:01:53Z) - Adaptive Axonal Delays in feedforward spiking neural networks for accurate spoken word recognition [4.018601183900039]
Spiking neural networks (SNNs) are a promising research avenue for building accurate and efficient automatic speech recognition systems.
Recent advances in audio-to-spike encoding and training algorithms enable SNNs to be applied in practical tasks.
Our work illustrates the potential of training axonal delays for tasks with complex temporal structures.
arXiv Detail & Related papers (2023-02-16T22:19:04Z) - Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
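Several of the papers above treat a transmission delay as the position of a tap in a temporal convolution kernel, which becomes learnable once that position is relaxed to a real value (the idea behind dilated convolutions with learnable spacings). A minimal sketch of that formulation; the Gaussian parameterization and all names here are illustrative assumptions, not code from any listed paper:

```python
import numpy as np

def delay_kernel(d, max_delay, sigma=0.5):
    """Soft one-hot temporal kernel centred at a (possibly fractional) delay d.

    A differentiable surrogate for a hard delay: the tap position d, not just
    the tap weight, can then be optimised by gradient descent.
    """
    taps = np.arange(max_delay + 1)
    k = np.exp(-0.5 * ((taps - d) / sigma) ** 2)
    return k / k.sum()

def delayed_synapse(spike_train, w, d, max_delay):
    """Apply weight w and delay d to a spike train via causal convolution."""
    k = delay_kernel(d, max_delay)
    # left-pad so output at time t only mixes inputs from t - max_delay .. t
    padded = np.concatenate([np.zeros(max_delay), spike_train])
    return w * np.convolve(padded, k, mode="valid")
```

As sigma shrinks, the kernel approaches a hard one-hot tap and the synapse reduces to an exact integer delay, which is the sparse, buffer-friendly regime the main paper exploits.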
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.