Neural Architecture Search for Spiking Neural Networks
- URL: http://arxiv.org/abs/2201.10355v1
- Date: Sun, 23 Jan 2022 16:34:27 GMT
- Title: Neural Architecture Search for Spiking Neural Networks
- Authors: Youngeun Kim, Yuhang Li, Hyoungseob Park, Yeshwanth Venkatesha,
Priyadarshini Panda
- Abstract summary: Spiking Neural Networks (SNNs) have gained huge attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs).
Most prior SNN methods use ANN-like architectures, which could provide sub-optimal performance for temporal sequence processing of binary information in SNNs.
We introduce a novel Neural Architecture Search (NAS) approach for finding better SNN architectures.
- Score: 10.303676184878896
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking Neural Networks (SNNs) have gained huge attention as a potential
energy-efficient alternative to conventional Artificial Neural Networks (ANNs)
due to their inherent high-sparsity activation. However, most prior SNN methods
use ANN-like architectures (e.g., VGG-Net or ResNet), which could provide
sub-optimal performance for temporal sequence processing of binary information
in SNNs. To address this, in this paper, we introduce a novel Neural
Architecture Search (NAS) approach for finding better SNN architectures.
Inspired by recent NAS approaches that find the optimal architecture from
activation patterns at initialization, we select the architecture that can
represent diverse spike activation patterns across different data samples
without training. Furthermore, to leverage the temporal correlation among the
spikes, we search for feedforward connections as well as backward connections
(i.e., temporal feedback connections) between layers. Interestingly, SNASNet
found by our search algorithm achieves higher performance with backward
connections, demonstrating the importance of designing SNN architecture for
suitably using temporal information. We conduct extensive experiments on three
image recognition benchmarks where we show that SNASNet achieves
state-of-the-art performance with significantly fewer timesteps (only 5).
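The training-free selection idea above (inspired by NAS-at-initialization methods such as NASWOT) can be sketched as follows: each input sample produces a binary spike-activation code from an untrained candidate network, and architectures whose codes differ more across samples score higher via the log-determinant of an agreement kernel. This is a minimal illustration, not the paper's implementation; the function name and the random binary codes standing in for real spike patterns are assumptions.

```python
import numpy as np

def activation_diversity_score(codes: np.ndarray) -> float:
    """NASWOT-style score: given binary activation codes (one row per
    input sample), build a kernel counting agreeing units between every
    pair of samples and return its log-determinant. Architectures whose
    untrained activations separate samples well score higher."""
    n_units = codes.shape[1]
    # Hamming distance between every pair of binary codes
    hamming = np.sum(codes[:, None, :] != codes[None, :, :], axis=-1)
    kernel = n_units - hamming  # number of agreeing units per code pair
    sign, logdet = np.linalg.slogdet(kernel.astype(np.float64))
    # A singular kernel (e.g., identical codes) gets the worst score.
    return logdet if sign > 0 else -np.inf

rng = np.random.default_rng(0)
# Stand-in for spike patterns of an untrained candidate SNN on a
# minibatch: 8 samples, 128 spiking units, each firing (1) or silent (0).
codes = rng.integers(0, 2, size=(8, 128))
score = activation_diversity_score(codes)
```

A search would compute this score for each sampled candidate architecture and keep the highest-scoring one, avoiding any training during the search itself.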
Related papers
- Spatial-Temporal Search for Spiking Neural Networks [32.937536365872745]
Spiking Neural Networks (SNNs) are considered a potential candidate for the next generation of artificial intelligence.
We propose a differentiable approach to optimize SNN on both spatial and temporal dimensions.
Our methods achieve comparable classification performance on CIFAR10, CIFAR100, and ImageNet, with accuracies of 96.43%, 78.96%, and 70.21%, respectively.
arXiv Detail & Related papers (2024-10-24T09:32:51Z)
- Advancing Spiking Neural Networks towards Multiscale Spatiotemporal Interaction Learning [10.702093960098106]
Spiking Neural Networks (SNNs) serve as an energy-efficient alternative to Artificial Neural Networks (ANNs).
We have designed a Spiking Multiscale Attention (SMA) module that captures multiscale spatiotemporal interaction information.
Our approach has achieved state-of-the-art results on mainstream neural datasets.
arXiv Detail & Related papers (2024-05-22T14:16:05Z)
- Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods [33.377770671553336]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs).
In this paper, we provide a new perspective to summarize the theories and methods for training deep SNNs with high performance.
arXiv Detail & Related papers (2024-05-06T09:58:54Z)
- Stochastic Spiking Neural Networks with First-to-Spike Coding [7.955633422160267]
Spiking Neural Networks (SNNs) are known for their bio-plausibility and energy efficiency.
In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures.
We investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, and energy consumption across datasets.
arXiv Detail & Related papers (2024-04-26T22:52:23Z)
- DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit CNNs [53.82853297675979]
1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
arXiv Detail & Related papers (2023-06-27T11:28:29Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Keys to Accurate Feature Extraction Using Residual Spiking Neural Networks [1.101002667958165]
Spiking neural networks (SNNs) have become an interesting alternative to conventional artificial neural networks (ANNs).
We present a study on the key components of modern spiking architectures.
We design a spiking version of the successful residual network (ResNet) architecture and test different components and training strategies on it.
arXiv Detail & Related papers (2021-11-10T21:29:19Z)
- Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.