Convolutional Spiking Neural Networks for Spatio-Temporal Feature
Extraction
- URL: http://arxiv.org/abs/2003.12346v2
- Date: Mon, 18 Jan 2021 00:23:29 GMT
- Title: Convolutional Spiking Neural Networks for Spatio-Temporal Feature
Extraction
- Authors: Ali Samadzadeh, Fatemeh Sadat Tabatabaei Far, Ali Javadi, Ahmad
Nickabadi, Morteza Haghir Chehreghani
- Abstract summary: Spiking neural networks (SNNs) can be used in low-power and embedded systems.
temporal coding in layers of convolutional spiking neural networks and other types of SNNs has yet to be studied.
We present a new deep spiking architecture to tackle real-world problems.
- Score: 3.9898522485253256
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) can be used in low-power and embedded systems
(such as emerging neuromorphic chips) due to their event-based nature. Also,
they have the advantage of low computation cost in contrast to conventional
artificial neural networks (ANNs), while preserving ANN's properties. However,
temporal coding in layers of convolutional spiking neural networks and other
types of SNNs has yet to be studied. In this paper, we provide insight into
spatio-temporal feature extraction of convolutional SNNs in experiments
designed to exploit this property. The shallow convolutional SNN outperforms
state-of-the-art spatio-temporal feature extractor methods such as C3D,
ConvLSTM, and similar networks. Furthermore, we present a new deep spiking
architecture to tackle real-world problems (in particular classification tasks)
which achieved superior performance compared to other SNN methods on NMNIST
(99.6%), DVS-CIFAR10 (69.2%) and DVS-Gesture (96.7%) and ANN methods on UCF-101
(42.1%) and HMDB-51 (21.5%) datasets. It is also worth noting that the training
process is based on a variation of spatio-temporal backpropagation explained in
the paper.
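The paper trains with a variation of spatio-temporal backpropagation (STBP). The exact update rules are given in the paper; the core idea common to STBP-style methods is to unroll leaky integrate-and-fire (LIF) dynamics over time and replace the non-differentiable spike with a surrogate gradient during the backward pass. The sketch below illustrates that idea only; the leak factor, threshold, and rectangular surrogate are illustrative assumptions, not the paper's exact choices.

```python
def lif_step(v, x, tau=2.0, v_th=1.0):
    """One discrete-time step of a leaky integrate-and-fire (LIF) neuron.

    v: membrane potential carried over from the previous step
    x: input current at this step
    tau, v_th: illustrative leak factor and firing threshold (assumptions)
    """
    v = v / tau + x                      # leaky integration of input
    spike = 1.0 if v >= v_th else 0.0    # non-differentiable Heaviside firing
    v = v * (1.0 - spike)                # hard reset after a spike
    return v, spike

def surrogate_grad(v, v_th=1.0, width=0.5):
    """Rectangular surrogate used in place of the Heaviside derivative
    when backpropagating through time (STBP-style training)."""
    return 1.0 / (2.0 * width) if abs(v - v_th) < width else 0.0

# Unroll one neuron over five time steps and record its spike train.
v, spikes = 0.0, []
for x in [0.6, 0.6, 0.6, 0.0, 0.9]:
    v, s = lif_step(v, x)
    spikes.append(s)
print(spikes)  # [0.0, 0.0, 1.0, 0.0, 0.0]
```

In a full implementation the surrogate replaces the spike function's derivative inside automatic differentiation, so gradients flow through both the spatial (layer) and temporal (time-step) dimensions.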
Related papers
- Spatial-Temporal Search for Spiking Neural Networks [32.937536365872745]
Spiking Neural Networks (SNNs) are considered a promising candidate for the next generation of artificial intelligence.
We propose a differentiable approach to optimize SNN on both spatial and temporal dimensions.
Our method achieves comparable classification performance on CIFAR10/100 and ImageNet, with accuracies of 96.43%, 78.96%, and 70.21%, respectively.
arXiv Detail & Related papers (2024-10-24T09:32:51Z) - Unveiling the Power of Sparse Neural Networks for Feature Selection [60.50319755984697]
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection.
We show that SNNs trained with dynamic sparse training (DST) algorithms can achieve, on average, more than 50% memory and 55% FLOPs reduction.
arXiv Detail & Related papers (2024-08-08T16:48:33Z) - LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks
with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
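TTFS (time-to-first-spike) coding represents a value by when a neuron fires rather than by how often, which is what makes the ANN-activation-to-spike-time mapping above possible. A minimal illustrative encoder is sketched below; the linear mapping and `t_max` are assumptions for illustration, not the LC-TTFS algorithm itself.

```python
def ttfs_encode(a, t_max=100):
    """Time-to-first-spike coding: a normalized activation in [0, 1]
    is mapped to a spike time, with stronger inputs firing earlier.
    The linear mapping and t_max are illustrative assumptions."""
    a = min(max(float(a), 0.0), 1.0)   # clamp to the unit interval
    return round(t_max * (1.0 - a))    # larger activation -> earlier spike

print(ttfs_encode(1.0), ttfs_encode(0.25), ttfs_encode(0.0))  # 0 75 100
```

Because each neuron fires at most once per input, energy cost scales with the number of neurons rather than the number of spikes, which is the appeal for power-constrained edge platforms.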
arXiv Detail & Related papers (2023-10-23T14:26:16Z) - High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically-inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that trained deep SNN models can achieve the same performance as ANNs.
Our SNN accomplishes high-performance classification with fewer than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - PC-SNN: Supervised Learning with Local Hebbian Synaptic Plasticity based
on Predictive Coding in Spiking Neural Networks [1.6172800007896282]
We propose a novel learning algorithm inspired by predictive coding theory.
We show that it can perform supervised learning fully autonomously and as successfully as backpropagation.
This method achieves a favorable performance compared to the state-of-the-art multi-layer SNNs.
arXiv Detail & Related papers (2022-11-24T09:56:02Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Keys to Accurate Feature Extraction Using Residual Spiking Neural
Networks [1.101002667958165]
Spiking neural networks (SNNs) have become an interesting alternative to conventional artificial neural networks (ANNs).
We present a study on the key components of modern spiking architectures.
We design a spiking version of the successful residual network (ResNet) architecture and test different components and training strategies on it.
arXiv Detail & Related papers (2021-11-10T21:29:19Z) - Accurate and efficient time-domain classification with adaptive spiking
recurrent neural networks [1.8515971640245998]
Spiking neural networks (SNNs) have been investigated as more biologically plausible and potentially more powerful models of neural computation.
We show how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs.
arXiv Detail & Related papers (2021-03-12T10:27:29Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.