Spiking Neural Network Architecture Search: A Survey
- URL: http://arxiv.org/abs/2510.14235v1
- Date: Thu, 16 Oct 2025 02:27:07 GMT
- Title: Spiking Neural Network Architecture Search: A Survey
- Authors: Kama Svoboda, Tosiron Adegbija
- Abstract summary: Spiking Neural Networks (SNNs) are a promising approach to neuromorphic computing. They offer significant advantages in terms of power efficiency and real-time resource-constrained processing. However, designing optimal SNN architectures poses significant challenges due to their inherent complexity.
- Score: 4.082216579462797
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This survey paper presents a comprehensive examination of Spiking Neural Network (SNN) architecture search (SNNaS) from a unique hardware/software co-design perspective. SNNs, inspired by biological neurons, have emerged as a promising approach to neuromorphic computing. They offer significant advantages in terms of power efficiency and real-time resource-constrained processing, making them ideal for edge computing and IoT applications. However, designing optimal SNN architectures poses significant challenges, due to their inherent complexity (e.g., with respect to training) and the interplay between hardware constraints and SNN models. We begin by providing an overview of SNNs, emphasizing their operational principles and key distinctions from traditional artificial neural networks (ANNs). We then provide a brief overview of the state of the art in NAS for ANNs, highlighting the challenges of directly applying these approaches to SNNs. We then survey the state-of-the-art in SNN-specific NAS approaches. Finally, we conclude with insights into future research directions for SNN research, emphasizing the potential of hardware/software co-design in unlocking the full capabilities of SNNs. This survey aims to serve as a valuable resource for researchers and practitioners in the field, offering a holistic view of SNNaS and underscoring the importance of a co-design approach to harness the true potential of neuromorphic computing.
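The abstract's distinction between SNNs and traditional ANNs rests on the neuron model: instead of emitting a real-valued activation each step, a spiking neuron integrates input into a membrane potential and fires a binary spike when a threshold is crossed. A minimal sketch of a leaky integrate-and-fire (LIF) neuron, a common SNN building block, illustrates this; the code and its parameter values (decay, threshold) are illustrative assumptions, not taken from the survey:

```python
def lif_neuron(input_currents, decay=0.9, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over discrete time steps.

    Returns the binary spike train. Unlike an ANN unit, which emits a
    real-valued activation at every step, the LIF neuron accumulates input
    into a leaky membrane potential and emits a 1 only when the potential
    crosses the threshold, after which the potential is reset.
    """
    potential = 0.0
    spikes = []
    for current in input_currents:
        potential = decay * potential + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire a spike
            potential = 0.0    # hard reset after firing
        else:
            spikes.append(0)
    return spikes


# A constant sub-threshold input still fires periodically as charge
# accumulates, illustrating sparse, event-driven output.
print(lif_neuron([0.4] * 10))  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

The sparse binary output is what makes SNNs attractive for the low-power, event-driven hardware discussed throughout the survey.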
Related papers
- S$^2$NN: Sub-bit Spiking Neural Networks [53.08060832135342]
Spiking Neural Networks (SNNs) offer an energy-efficient paradigm for machine intelligence. Despite recent advances in binary SNNs, the storage and computational demands remain substantial for large-scale networks. We propose Sub-bit Spiking Neural Networks (S$^2$NNs) that represent weights with less than one bit.
arXiv Detail & Related papers (2025-09-29T04:17:44Z)
- A Self-Ensemble Inspired Approach for Effective Training of Binary-Weight Spiking Neural Networks [66.80058515743468]
Training Spiking Neural Networks (SNNs) and Binary Neural Networks (BNNs) is challenging because of the non-differentiable spike generation function. We present a novel perspective on the dynamics of SNNs and their close connection to BNNs through an analysis of the backpropagation process. Specifically, we leverage a structure of multiple shortcuts and a knowledge distillation-based training technique to improve the training of (binary-weight) SNNs.
arXiv Detail & Related papers (2025-08-18T04:11:06Z)
- Edge Intelligence with Spiking Neural Networks [50.33340747216377]
Spiking Neural Networks (SNNs) offer low-power, event-driven computation on resource-constrained devices. We present a systematic taxonomy of EdgeSNN foundations, encompassing neuron models, learning algorithms, and supporting hardware platforms. Three representative practical considerations of EdgeSNN are discussed in depth: on-device inference using lightweight SNN models, resource-aware training and updating under non-stationary data conditions, and security and privacy issues.
arXiv Detail & Related papers (2025-07-18T16:47:52Z)
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs for a wide range of operations (OPs) from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task, and our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods [33.377770671553336]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs).
In this paper, we provide a new perspective to summarize the theories and methods for training deep SNNs with high performance.
arXiv Detail & Related papers (2024-05-06T09:58:54Z)
- Harnessing Neuron Stability to Improve DNN Verification [42.65507402735545]
We present VeriStable, a novel extension of the recently proposed DPLL-based constraint DNN verification approach.
We evaluate the effectiveness of VeriStable across a range of challenging benchmarks, including fully-connected feedforward networks (FNNs), convolutional neural networks (CNNs), and residual networks (ResNets).
Preliminary results show that VeriStable is competitive and outperforms state-of-the-art verification tools, including $\alpha$-$\beta$-CROWN and MN-BaB, the first- and second-place performers of VNN-COMP, respectively.
arXiv Detail & Related papers (2024-01-19T23:48:04Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Exploiting Noise as a Resource for Computation and Learning in Spiking Neural Networks [32.0086664373154]
This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
arXiv Detail & Related papers (2023-05-25T13:21:26Z)
- Uncovering the Representation of Spiking Neural Networks Trained with Surrogate Gradient [11.0542573074431]
Spiking Neural Networks (SNNs) are recognized as a candidate for the next generation of neural networks due to their bio-plausibility and energy efficiency.
Recently, researchers have demonstrated that SNNs are able to achieve nearly state-of-the-art performance in image recognition tasks using surrogate gradient training.
arXiv Detail & Related papers (2023-04-25T19:08:29Z)
- Spiking Neural Networks for Visual Place Recognition via Weighted Neuronal Assignments [24.754429120321365]
Spiking neural networks (SNNs) offer compelling potential advantages, including energy efficiency and low latency.
One promising application area for high-performance SNNs is template matching and image recognition.
This research introduces the first high-performance SNN for the Visual Place Recognition (VPR) task.
arXiv Detail & Related papers (2021-09-14T05:40:40Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement-learning-based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
- Spiking Neural Networks -- Part I: Detecting Spatial Patterns [38.518936229794214]
Spiking Neural Networks (SNNs) are biologically inspired machine learning models that build on dynamic neuronal models processing binary and sparse spiking signals in an event-driven, online fashion.
SNNs can be implemented on neuromorphic computing platforms that are emerging as energy-efficient co-processors for learning and inference.
arXiv Detail & Related papers (2020-10-27T11:37:22Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- SiamSNN: Siamese Spiking Neural Networks for Energy-Efficient Object Tracking [20.595208488431766]
SiamSNN is the first deep SNN tracker that achieves short latency and low precision loss on the visual object tracking benchmarks OTB2013, VOT2016, and GOT-10k.
SiamSNN notably achieves low energy consumption and real-time performance on the neuromorphic chip TrueNorth.
arXiv Detail & Related papers (2020-03-17T08:49:51Z)
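Several entries above refer to surrogate-gradient training, the standard workaround for the non-differentiable spike generation function: the forward pass uses the hard threshold, while the backward pass substitutes a smooth approximation. A framework-free sketch follows; the fast-sigmoid surrogate and the `slope` parameter are common illustrative choices, not taken from any specific paper listed here:

```python
def spike_forward(v, threshold=1.0):
    """Heaviside step used in the forward pass: spike iff the membrane
    potential crosses the threshold. Its true gradient is zero almost
    everywhere, so it cannot be backpropagated through directly."""
    return 1.0 if v >= threshold else 0.0


def spike_surrogate_grad(v, threshold=1.0, slope=10.0):
    """Surrogate derivative used in the backward pass in place of the
    Heaviside's gradient: here, the derivative of a fast sigmoid."""
    x = slope * (v - threshold)
    return slope / (1.0 + abs(x)) ** 2


# The surrogate peaks at the threshold and decays away from it, so
# neurons close to firing receive the largest gradient signal.
print(spike_surrogate_grad(1.0))  # at threshold → 10.0
print(spike_surrogate_grad(0.5))  # far from threshold → small
```

Frameworks apply the same idea by registering the surrogate as the custom backward function of the spike operation, which is what enables the "direct training" and "surrogate gradient" results surveyed above.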
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.