SDSNN: A Single-Timestep Spiking Neural Network with Self-Dropping Neuron and Bayesian Optimization
- URL: http://arxiv.org/abs/2508.10913v1
- Date: Fri, 01 Aug 2025 03:41:47 GMT
- Title: SDSNN: A Single-Timestep Spiking Neural Network with Self-Dropping Neuron and Bayesian Optimization
- Authors: Changqing Xu, Buxuan Song, Yi Liu, Xinfang Liao, Wenbin Zheng, Yintang Yang
- Abstract summary: Spiking Neural Networks (SNNs) are an emerging biologically inspired computational model. SNNs transmit information through discrete spike signals, which substantially reduces computational energy consumption. We propose a single-timestep SNN, which enhances accuracy and reduces computational energy consumption in a single timestep.
- Score: 3.939441643960418
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs), as an emerging biologically inspired computational model, demonstrate significant energy efficiency advantages due to their event-driven information processing mechanism. Compared to traditional Artificial Neural Networks (ANNs), SNNs transmit information through discrete spike signals, which substantially reduces computational energy consumption through their sparse encoding approach. However, the multi-timestep computation model significantly increases inference latency and energy, limiting the applicability of SNNs in edge computing scenarios. We propose a single-timestep SNN, which enhances accuracy and reduces computational energy consumption in a single timestep by optimizing spike generation and temporal parameters. We design a Self-Dropping Neuron mechanism, which enhances information-carrying capacity through dynamic threshold adjustment and selective spike suppression. Furthermore, we employ Bayesian optimization to globally search for time parameters and obtain an efficient inference mode with a single time step. Experimental results on the Fashion-MNIST, CIFAR-10, and CIFAR-100 datasets demonstrate that, compared to traditional multi-timestep SNNs employing the Leaky Integrate-and-Fire (LIF) model, our method achieves classification accuracies of 93.72%, 92.20%, and 69.45%, respectively, using only single-timestep spikes, while maintaining comparable or even superior accuracy. Additionally, it reduces energy consumption by 56%, 21%, and 22%, respectively.
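The Self-Dropping Neuron described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the mean-activity form of the dynamic threshold, and the coefficient `alpha` are all assumptions, since the abstract only states that the threshold adapts dynamically and that weak spikes are selectively suppressed.

```python
import numpy as np

def self_dropping_lif_step(x, v_th=1.0, alpha=0.5):
    """One-timestep LIF-style firing with a hypothetical self-dropping rule.

    x: integrated input currents (membrane potentials), shape (N,).
    v_th: base firing threshold.
    alpha: assumed coefficient scaling the dynamic threshold raise.
    """
    # Dynamic threshold: raise the threshold in proportion to mean positive
    # activity, so weak spikes are dropped (an illustrative stand-in for the
    # paper's dynamic threshold adjustment / selective spike suppression).
    v_dyn = v_th + alpha * np.mean(np.maximum(x, 0.0))
    spikes = (x >= v_dyn).astype(np.float32)
    return spikes, v_dyn
```

With inputs `[0.2, 1.5, 3.0]`, the raised threshold drops both the weak 0.2 input and the borderline 1.5 input, and only the strongest neuron fires in the single timestep.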
Related papers
- Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - TS-SNN: Temporal Shift Module for Spiking Neural Networks [12.35332483263129]
Spiking Neural Networks (SNNs) are recognized for their biological plausibility and energy efficiency. We introduce the Temporal Shift module for Spiking Neural Networks (TS-SNN), which incorporates a novel Temporal Shift (TS) module to integrate past, present, and future spike features. TS-SNN achieves state-of-the-art performance on benchmarks like CIFAR-10 (96.72%), CIFAR-100 (80.28%), and ImageNet (70.61%) with fewer timesteps, while maintaining low energy consumption.
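At a high level, a temporal-shift module mixes spike features across neighbouring timesteps. The sketch below follows the generic temporal-shift idea rather than TS-SNN's exact design; the `(T, C)` layout, the quarter-fold split, and zero-padding at the sequence boundaries are assumptions.

```python
import numpy as np

def temporal_shift(spikes, fold_div=4):
    # spikes: (T, C) array of timesteps x channels.
    # One fold of channels looks one step into the future, one fold one step
    # into the past, and the remainder stays in place; boundary rows that
    # have no neighbour to copy from are zero-padded.
    T, C = spikes.shape
    fold = C // fold_div
    out = np.zeros_like(spikes)
    out[:-1, :fold] = spikes[1:, :fold]                  # pull future spikes back
    out[1:, fold:2 * fold] = spikes[:-1, fold:2 * fold]  # push past spikes forward
    out[:, 2 * fold:] = spikes[:, 2 * fold:]             # untouched channels
    return out
```

Because the shift is a pure re-indexing, it adds essentially no multiply-accumulate cost, which is consistent with the low-energy claim in the abstract.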
arXiv Detail & Related papers (2025-05-07T06:34:34Z) - Scalable Mechanistic Neural Networks for Differential Equations and Machine Learning [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences. We reduce the computational time and space complexities from cubic and quadratic with respect to the sequence length, respectively, to linear. Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z) - Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency in neuromorphic datasets.
We propose a Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Direct Training Needs Regularisation: Anytime Optimal Inference Spiking Neural Network [23.434563009813218]
The Spiking Neural Network (SNN) is acknowledged as the next generation of the Artificial Neural Network (ANN).
We introduce a novel regularisation technique, the Spatial-Temporal Regulariser (STR).
STR regulates the ratio between the strength of spikes and membrane potential at each timestep.
This effectively balances spatial and temporal performance during training, ultimately resulting in an Anytime Optimal Inference (AOI) SNN.
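A regulariser of the kind the STR summary describes can be sketched as below. The exact form is an assumption: the abstract only states that STR regulates the per-timestep ratio between spike strength and membrane potential, so the penalty shape, the `target_ratio` parameter, and the function name are illustrative.

```python
import numpy as np

def str_penalty(spikes, membrane, target_ratio=0.5):
    """Hypothetical spatial-temporal regulariser.

    spikes, membrane: (T, N) arrays of spike outputs and membrane potentials.
    Penalises deviation of the per-timestep spike/membrane-strength ratio
    from a target, averaged over timesteps.
    """
    ratios = spikes.sum(axis=1) / (np.abs(membrane).sum(axis=1) + 1e-8)
    return float(np.mean((ratios - target_ratio) ** 2))
```

Added to the task loss, such a term nudges each timestep toward a consistent spike/potential balance during training.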
arXiv Detail & Related papers (2024-04-15T15:57:01Z) - Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z) - LitE-SNN: Designing Lightweight and Efficient Spiking Neural Network through Spatial-Temporal Compressive Network Search and Joint Optimization [48.41286573672824]
Spiking Neural Networks (SNNs) mimic the information-processing mechanisms of the human brain and are highly energy-efficient.
We propose a new approach named LitE-SNN that incorporates both spatial and temporal compression into the automated network design process.
arXiv Detail & Related papers (2024-01-26T05:23:11Z) - Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach fully spiking denoising diffusion implicit model (FSDDIM) to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z) - Input-Aware Dynamic Timestep Spiking Neural Networks for Efficient In-Memory Computing [7.738130109655604]
Spiking Neural Networks (SNNs) have attracted widespread research interest because of their capability to process sparse and binary spike information.
We show that the energy cost and latency of SNNs scale linearly with the number of timesteps used on IMC hardware.
We propose input-aware Dynamic Timestep SNN (DT-SNN) to maximize the efficiency of SNNs.
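Since energy and latency grow linearly with the timestep count, an input-aware scheme can stop accumulating evidence once the prediction is confident. The loop below is a rough sketch of that idea; the entropy-based stopping criterion and its threshold are assumptions, as the abstract does not specify DT-SNN's actual rule.

```python
import numpy as np

def dynamic_timestep_inference(logits_per_step, entropy_th=0.5):
    """Accumulate per-timestep logits and exit early when the softmax
    entropy of the running prediction falls below a threshold.

    logits_per_step: sequence of (K,) logit arrays, one per timestep.
    Returns (predicted class, number of timesteps actually used).
    """
    acc = np.zeros_like(logits_per_step[0], dtype=np.float64)
    for t, logits in enumerate(logits_per_step, start=1):
        acc += logits
        p = np.exp(acc - acc.max())          # numerically stable softmax
        p /= p.sum()
        entropy = -np.sum(p * np.log(p + 1e-12))
        if entropy < entropy_th:             # confident enough: stop early
            return int(np.argmax(p)), t
    return int(np.argmax(p)), t
```

Easy inputs with a clearly dominant class exit after one timestep, while ambiguous inputs consume the full budget, which is how such a scheme trades accuracy against the linear timestep cost.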
arXiv Detail & Related papers (2023-05-27T03:01:27Z) - TopSpark: A Timestep Optimization Methodology for Energy-Efficient Spiking Neural Networks on Autonomous Mobile Agents [14.916996986290902]
Spiking Neural Networks (SNNs) offer low power/energy processing due to sparse computations and efficient online learning.
TopSpark is a novel methodology that leverages adaptive timestep reduction to enable energy-efficient SNN processing in both training and inference.
arXiv Detail & Related papers (2023-03-03T10:20:45Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments on datasets show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.