Keys to Accurate Feature Extraction Using Residual Spiking Neural
Networks
- URL: http://arxiv.org/abs/2111.05955v2
- Date: Fri, 12 Nov 2021 09:26:30 GMT
- Title: Keys to Accurate Feature Extraction Using Residual Spiking Neural
Networks
- Authors: Alex Vicente-Sola, Davide L. Manna, Paul Kirkland, Gaetano Di
Caterina, Trevor Bihl
- Abstract summary: Spiking neural networks (SNNs) have become an interesting alternative to conventional artificial neural networks (ANNs).
We present a study on the key components of modern spiking architectures.
We design a spiking version of the successful residual network (ResNet) architecture and test different components and training strategies on it.
- Score: 1.101002667958165
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Spiking neural networks (SNNs) have become an interesting alternative to conventional artificial neural networks (ANNs) thanks to their temporal processing capabilities and their low-SWaP (Size, Weight, and Power), energy-efficient implementations in neuromorphic hardware. However, the challenges involved in training SNNs have limited their accuracy and thus their applications. Improving learning algorithms and neural architectures for more accurate feature extraction is therefore one of the current priorities in SNN research. In this paper, we present a study of the key components of modern spiking architectures. We empirically compare, on image classification datasets, different techniques taken from the best-performing networks. We design a spiking version of the successful residual network (ResNet) architecture and test different components and training strategies on it. Our results provide a state-of-the-art guide to SNN design, enabling informed choices when building an optimal visual feature extractor. Finally, our network outperforms previous SNN architectures on CIFAR-10 (94.1%) and CIFAR-100 (74.5%) and matches the state of the art on DVS-CIFAR10 (71.3%), with fewer parameters than the previous state of the art and without the need for ANN-SNN conversion. Code available at https://github.com/VicenteAlex/Spiking_ResNet.
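As an illustration of the kind of building block the paper studies, below is a minimal PyTorch sketch of a spiking residual block with leaky integrate-and-fire (LIF) neurons and a rectangular surrogate gradient. The class name, hyperparameters (tau, v_th, the surrogate window width), and the shortcut placement are illustrative assumptions, not the authors' exact design; see the linked repository for the actual implementation.

```python
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside step in the forward pass; rectangular surrogate gradient in the backward pass."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the threshold (window width 0.5 is an illustrative choice).
        return grad_output * (v.abs() < 0.5).float()

class SpikingResidualBlock(nn.Module):
    """Hypothetical spiking ResNet block: two conv-BN-LIF stages plus an identity shortcut."""
    def __init__(self, channels, tau=0.5, v_th=1.0):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.tau, self.v_th = tau, v_th

    def forward(self, spikes, state):
        v1, v2 = state                      # membrane potentials carried across time steps
        v1 = self.tau * v1 + self.bn1(self.conv1(spikes))
        s1 = SurrogateSpike.apply(v1 - self.v_th)
        v1 = v1 * (1.0 - s1)                # hard reset after firing
        # Shortcut placement (here: added to the membrane before the second neuron
        # fires) is one of the design choices studies like this one compare.
        v2 = self.tau * v2 + self.bn2(self.conv2(s1)) + spikes
        s2 = SurrogateSpike.apply(v2 - self.v_th)
        v2 = v2 * (1.0 - s2)
        return s2, (v1, v2)

# Usage over T simulation time steps (shapes and values are dummies):
block = SpikingResidualBlock(channels=16)
x_seq = torch.rand(8, 2, 16, 32, 32)        # (T, batch, C, H, W)
state = (torch.zeros(2, 16, 32, 32), torch.zeros(2, 16, 32, 32))
for t in range(x_seq.size(0)):
    out, state = block((x_seq[t] > 0.5).float(), state)
```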
Related papers
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs across a wide range of operation counts (OPs), from 20M to 200M.
In addition, we validate the transferability of the searched BNNs to the object detection task: our binary detectors achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
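For context, the basic operation such binary-network searches build on is weight binarization with a straight-through estimator (STE). The sketch below shows that standard building block, not NAS-BNN's search procedure; the layer name and initialization are illustrative.

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """Binarize weights to {-1, +1}; pass gradients straight through, clipped to [-1, 1]."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.where(w >= 0, torch.ones_like(w), -torch.ones_like(w))

    @staticmethod
    def backward(ctx, grad_output):
        (w,) = ctx.saved_tensors
        return grad_output * (w.abs() <= 1.0).float()

class BinaryLinear(nn.Module):
    """Linear layer whose real-valued latent weights are binarized on the forward pass."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features).uniform_(-0.1, 0.1))

    def forward(self, x):
        return x @ BinarizeSTE.apply(self.weight).t()

layer = BinaryLinear(64, 10)
y = layer(torch.randn(4, 64))   # gradients flow back to the latent real-valued weights
```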
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- Unveiling the Power of Sparse Neural Networks for Feature Selection [60.50319755984697]
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection.
We show that feature selection with SNNs trained by dynamic sparse training (DST) algorithms can achieve, on average, more than 50% memory and 55% FLOPs reduction.
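The DST family referred to here iterates a prune-and-regrow cycle on a sparse connectivity mask. The sketch below shows a generic magnitude-prune / random-regrow update in the spirit of algorithms like SET; it is an assumption about the general mechanism, not this paper's exact method.

```python
import torch

def prune_and_regrow(weight, mask, frac=0.2):
    """Generic dynamic-sparse-training step: drop the weakest active weights,
    then regrow the same number of connections at random inactive positions."""
    w, m = weight.view(-1), mask.view(-1)
    active = m.bool()
    n_drop = int(frac * active.sum().item())
    if n_drop == 0:
        return mask
    # Prune: smallest-magnitude active connections (inactive slots masked out).
    scores = w.detach().abs().clone()
    scores[~active] = float('inf')
    drop_idx = torch.topk(scores, n_drop, largest=False).indices
    m[drop_idx] = 0.0
    # Regrow: random currently-inactive positions, initialized to zero.
    inactive_idx = (m == 0).nonzero().flatten()
    grow_idx = inactive_idx[torch.randperm(inactive_idx.numel())[:n_drop]]
    m[grow_idx] = 1.0
    with torch.no_grad():
        w[grow_idx] = 0.0
    return mask

# During training, re-apply the mask after each optimizer step:
weight = torch.nn.Parameter(torch.randn(128, 784))
mask = (torch.rand_like(weight) < 0.1).float()   # ~90% sparse to start
mask = prune_and_regrow(weight, mask)
with torch.no_grad():
    weight.mul_(mask)
```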
arXiv Detail & Related papers (2024-08-08T16:48:33Z)
- Investigating Sparsity in Recurrent Neural Networks [0.0]
This thesis investigates the effects of pruning, and of sparse network structure, on the performance of recurrent neural networks (RNNs).
We first describe the pruning of RNNs, its impact on their performance, and the number of training epochs required to regain accuracy after pruning.
Next, we create and train Sparse Recurrent Neural Networks and identify the relation between their performance and the graph properties of their underlying arbitrary structure.
arXiv Detail & Related papers (2024-07-30T07:24:58Z)
- Optimizing Convolutional Neural Network Architecture [0.0]
Convolutional Neural Networks (CNNs) are widely used for challenging tasks such as speech recognition, natural language processing, and computer vision.
We propose Optimizing Convolutional Neural Network Architecture (OCNNA), a novel CNN optimization and construction method based on pruning and knowledge distillation.
Our method has been compared against more than 20 CNN simplification algorithms, obtaining outstanding results.
arXiv Detail & Related papers (2023-12-17T12:23:11Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
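Two of the coding schemes such a hybrid design can mix are rate coding and latency (time-to-first-spike) coding. The minimal encoders below illustrate the difference; they are generic textbook versions, not the paper's architecture.

```python
import torch

def rate_encode(x, T):
    """Rate coding: intensity in [0, 1] sets the per-step spike probability (Bernoulli)."""
    return (torch.rand(T, *x.shape) < x).float()          # shape (T, ...)

def latency_encode(x, T):
    """Latency coding: stronger inputs fire earlier; exactly one spike per input."""
    t_spike = ((1.0 - x.clamp(0.0, 1.0)) * (T - 1)).long()
    spikes = torch.zeros(T, *x.shape)
    spikes.scatter_(0, t_spike.unsqueeze(0), 1.0)
    return spikes

x = torch.rand(4, 4)                  # dummy "image"
print(rate_encode(x, T=8).mean(0))    # approximates x for large T
print(latency_encode(x, T=8).sum(0))  # all ones: one spike per pixel
```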
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance with low latency.
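The summary does not spell out DSR's derivation. The sketch below illustrates the general idea it relates to: computing the training loss on a time-averaged (rate-based) spike representation rather than on individual non-differentiable spikes. The sigmoid surrogate for the spike function is an assumption for illustration, not the DSR method itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

V_TH = 1.0  # firing threshold (illustrative value)

def spike(v):
    """Hard threshold forward; sigmoid surrogate backward (straight-through trick)."""
    soft = torch.sigmoid(4.0 * (v - V_TH))
    return (v > V_TH).float() + soft - soft.detach()

def rate_output(fc1, fc2, x, T=8, tau=0.5):
    """Run T time steps and return the averaged output spikes: the rate representation."""
    v1 = torch.zeros(x.size(0), fc1.out_features)
    v2 = torch.zeros(x.size(0), fc2.out_features)
    rates = 0.0
    for _ in range(T):
        v1 = tau * v1 + fc1(x);  s1 = spike(v1); v1 = v1 * (1 - s1)
        v2 = tau * v2 + fc2(s1); s2 = spike(v2); v2 = v2 * (1 - s2)
        rates = rates + s2
    return rates / T

fc1, fc2 = nn.Linear(784, 128), nn.Linear(128, 10)
opt = torch.optim.Adam([*fc1.parameters(), *fc2.parameters()], lr=1e-3)
x, y = torch.rand(32, 784), torch.randint(0, 10, (32,))
loss = F.cross_entropy(rate_output(fc1, fc2, x), y)  # loss on rates, not single spikes
opt.zero_grad(); loss.backward(); opt.step()
```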
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Neural Architecture Search for Spiking Neural Networks [10.303676184878896]
Spiking Neural Networks (SNNs) have gained huge attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs).
Most prior SNN methods use ANN-like architectures, which could provide sub-optimal performance for temporal sequence processing of binary information in SNNs.
We introduce a novel Neural Architecture Search (NAS) approach for finding better SNN architectures.
arXiv Detail & Related papers (2022-01-23T16:34:27Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Going Deeper With Directly-Trained Larger Spiking Neural Networks [20.40894876501739]
Spiking neural networks (SNNs) are promising for bio-plausible coding of spatio-temporal information and event-driven signal processing.
However, the unique working mode of SNNs makes them more difficult to train than traditional networks.
We propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation (STBP).
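tdBN normalizes pre-activations jointly over the batch and time dimensions and rescales them relative to the firing threshold. The class below is a simplified training-mode sketch of that idea: running statistics for inference are omitted, and alpha is treated as a scaling hyperparameter.

```python
import torch
import torch.nn as nn

class ThresholdDependentBN(nn.Module):
    """Simplified tdBN sketch: normalize over (time, batch, H, W) jointly,
    then rescale so pre-activations are matched to the firing threshold v_th."""
    def __init__(self, channels, v_th=1.0, alpha=1.0, eps=1e-5):
        super().__init__()
        self.gamma = nn.Parameter(torch.ones(channels))
        self.beta = nn.Parameter(torch.zeros(channels))
        self.v_th, self.alpha, self.eps = v_th, alpha, eps

    def forward(self, x):
        # x: (T, B, C, H, W) -- pre-activations stacked over simulation time steps.
        mean = x.mean(dim=(0, 1, 3, 4), keepdim=True)
        var = x.var(dim=(0, 1, 3, 4), unbiased=False, keepdim=True)
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        scale = (self.alpha * self.v_th * self.gamma).view(1, 1, -1, 1, 1)
        return scale * x_hat + self.beta.view(1, 1, -1, 1, 1)

bn = ThresholdDependentBN(channels=16)
y = bn(torch.randn(4, 2, 16, 8, 8))   # (T=4, B=2, C=16, H=8, W=8)
```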
arXiv Detail & Related papers (2020-10-29T07:15:52Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
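The framework's details are not given in this summary. For context, the sketch below shows the classic rate-based ANN-to-SNN conversion baseline (threshold balancing on calibration data, integrate-and-fire neurons with soft reset) that conversion approaches build on; it is a generic illustration under those assumptions, not the paper's progressive tandem algorithm.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def calibrate_thresholds(layers, calib_x):
    """Threshold balancing: record each layer's max ReLU pre-activation on calibration data."""
    thresholds, h = [], calib_x
    for layer in layers:
        z = layer(h)
        thresholds.append(z.max().item())
        h = torch.relu(z)
    return thresholds

@torch.no_grad()
def converted_snn_forward(layers, thresholds, x, T=64):
    """Integrate-and-fire inference with soft reset; averaged spike outputs
    approximate the source ANN's ReLU activations."""
    v = [0.0] * len(layers)
    counts = 0.0
    for _ in range(T):
        h = x                                      # constant input current each step
        for i, layer in enumerate(layers):
            v[i] = v[i] + layer(h)
            fire = (v[i] >= thresholds[i]).float()
            v[i] = v[i] - fire * thresholds[i]     # soft reset: subtract threshold
            h = fire * thresholds[i]               # spikes carry the threshold magnitude
        counts = counts + h
    return counts / T

layers = [nn.Linear(784, 128), nn.Linear(128, 10)]   # stand-ins for trained ANN layers
th = calibrate_thresholds(layers, torch.rand(256, 784))
out = converted_snn_forward(layers, th, torch.rand(32, 784))
```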
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Convolutional Spiking Neural Networks for Spatio-Temporal Feature Extraction [3.9898522485253256]
Spiking neural networks (SNNs) can be used in low-power and embedded systems.
However, temporal coding in the layers of convolutional neural networks and other types of SNNs has yet to be studied.
We present a new deep spiking architecture to tackle real-world problems.
arXiv Detail & Related papers (2020-03-27T11:58:51Z)