Neuromorphic Data Augmentation for Training Spiking Neural Networks
- URL: http://arxiv.org/abs/2203.06145v1
- Date: Fri, 11 Mar 2022 18:17:19 GMT
- Title: Neuromorphic Data Augmentation for Training Spiking Neural Networks
- Authors: Yuhang Li, Youngeun Kim, Hyoungseob Park, Tamar Geller, Priyadarshini
Panda
- Abstract summary: We propose neuromorphic data augmentation (NDA) for event-based datasets.
NDA significantly stabilizes the SNN training and reduces the generalization gap between training and test performance.
For the first time, we demonstrate the feasibility of unsupervised contrastive learning for SNNs.
- Score: 10.303676184878896
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Developing neuromorphic intelligence on event-based datasets with spiking
neural networks (SNNs) has recently attracted much research attention. However,
the limited size of event-based datasets makes SNNs prone to overfitting and
unstable convergence. This issue remains unexplored by previous academic works.
In an effort to minimize this generalization gap, we propose neuromorphic data
augmentation (NDA), a family of geometric augmentations specifically designed
for event-based datasets with the goal of significantly stabilizing the SNN
training and reducing the generalization gap between training and test
performance. The proposed method is simple and compatible with existing SNN
training pipelines. Using the proposed augmentation, for the first time, we
demonstrate the feasibility of unsupervised contrastive learning for SNNs. We
conduct comprehensive experiments on prevailing neuromorphic vision benchmarks
and show that NDA yields substantial improvements over previous
state-of-the-art results. For example, NDA-based SNNs improve accuracy on
CIFAR10-DVS and N-Caltech 101 by 10.1% and 13.7%, respectively.
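
The abstract describes NDA as a family of geometric transforms applied directly to event-based inputs. The following is a minimal sketch, not the authors' released implementation: it assumes the event stream has already been binned into a frame tensor of shape [T, C, H, W], picks one transform at random, and applies it identically at every timestep so the temporal structure of the recording is preserved. The function name `nda_augment`, the transform set (flip, roll, cutout), and the parameter ranges are illustrative placeholders.

```python
import random
import torch

def nda_augment(frames: torch.Tensor) -> torch.Tensor:
    """Apply one randomly chosen geometric augmentation to an event-frame
    tensor of shape [T, C, H, W]. Hypothetical sketch: the same transform is
    shared across all timesteps so the event stream stays consistent."""
    _, _, H, W = frames.shape
    choice = random.choice(["identity", "flip", "roll", "cutout"])

    if choice == "flip":
        # Horizontal flip along the width axis.
        frames = torch.flip(frames, dims=[3])
    elif choice == "roll":
        # Random translation implemented as a cyclic shift of up to ~12%
        # of the spatial resolution.
        dy = random.randint(-H // 8, H // 8)
        dx = random.randint(-W // 8, W // 8)
        frames = torch.roll(frames, shifts=(dy, dx), dims=(2, 3))
    elif choice == "cutout":
        # Zero out one random square patch, shared across all timesteps.
        size = H // 4
        y = random.randint(0, H - size)
        x = random.randint(0, W - size)
        frames = frames.clone()
        frames[:, :, y:y + size, x:x + size] = 0.0
    return frames

# Example: augment a dummy CIFAR10-DVS-style sample with 10 timesteps and
# two polarity channels.
sample = torch.rand(10, 2, 48, 48)
augmented = nda_augment(sample)
print(augmented.shape)  # torch.Size([10, 2, 48, 48])
```

Since geometric transforms like these operate on binned frames rather than raw events, such an augmentation would typically be applied per sample at data-loading time, which keeps it compatible with existing SNN training pipelines, as the abstract notes.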
Related papers
- Membership Privacy Evaluation in Deep Spiking Neural Networks [32.42695393291052]
Unlike conventional ANNs, whose neurons apply non-linear functions to produce floating-point outputs, Spiking Neural Networks (SNNs) communicate through discrete spikes.
In this paper, we evaluate the membership privacy of SNNs by considering eight membership inference attacks (MIAs).
We show that SNNs are more vulnerable than ANNs (up to 10% higher balanced attack accuracy) when both are trained on neuromorphic datasets.
arXiv Detail & Related papers (2024-09-28T17:13:04Z) - Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose a Hybrid Step-wise Distillation (HSD) method tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Advancing Spiking Neural Networks towards Multiscale Spatiotemporal Interaction Learning [10.702093960098106]
Spiking Neural Networks (SNNs) serve as an energy-efficient alternative to Artificial Neural Networks (ANNs).
We design a Spiking Multiscale Attention (SMA) module that captures multiscale spatiotemporal interaction information.
Our approach achieves state-of-the-art results on mainstream neuromorphic datasets.
arXiv Detail & Related papers (2024-05-22T14:16:05Z) - Transferability of coVariance Neural Networks and Application to
Interpretable Brain Age Prediction using Anatomical Features [119.45320143101381]
Graph convolutional networks (GCN) leverage topology-driven graph convolutional operations to combine information across the graph for inference tasks.
We study GCNs with covariance matrices as graphs in the form of coVariance neural networks (VNNs).
VNNs inherit the scale-free data processing architecture of GCNs, and we show that VNNs exhibit transferability of performance over datasets whose covariance matrices converge to a limit object.
arXiv Detail & Related papers (2023-05-02T22:15:54Z) - Optimising Event-Driven Spiking Neural Network with Regularisation and
Cutoff [33.91830001268308]
Spiking neural networks (SNNs) offer promising improvements in computational efficiency.
Current SNN training methodologies predominantly employ a fixed timestep approach.
We propose a cutoff mechanism for SNNs that can terminate inference at any timestep, enabling efficient anytime inference.
arXiv Detail & Related papers (2023-01-23T16:14:09Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance while addressing this non-differentiability.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Toward Robust Spiking Neural Network Against Adversarial Perturbation [22.56553160359798]
Spiking neural networks (SNNs) are increasingly deployed in efficiency-critical real-world applications.
Researchers have already demonstrated an SNN can be attacked with adversarial examples.
To the best of our knowledge, this is the first analysis of robust training for SNNs.
arXiv Detail & Related papers (2022-04-12T21:26:49Z) - Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer [77.78479877473899]
We design a spatial-temporal-fusion BNN for efficiently scaling BNNs to large models.
Compared to vanilla BNNs, our approach greatly reduces the training time and the number of parameters, which helps scale BNNs efficiently.
arXiv Detail & Related papers (2021-12-12T17:13:14Z) - Shift-Robust GNNs: Overcoming the Limitations of Localized Graph
Training data [52.771780951404565]
Shift-Robust GNN (SR-GNN) is designed to account for distributional differences between biased training data and the graph's true inference distribution.
We show that SR-GNN outperforms other GNN baselines in accuracy, eliminating at least 40% of the negative effects introduced by biased training data.
arXiv Detail & Related papers (2021-08-02T18:00:38Z) - Deep Time Delay Neural Network for Speech Enhancement with Full Data
Learning [60.20150317299749]
This paper proposes a deep time delay neural network (TDNN) for speech enhancement with full data learning.
To make full use of the training data, we propose a full data learning method for speech enhancement.
arXiv Detail & Related papers (2020-11-11T06:32:37Z) - Comparing SNNs and RNNs on Neuromorphic Vision Datasets: Similarities
and Differences [36.82069150045153]
Spiking neural networks (SNNs) and recurrent neural networks (RNNs) are benchmarked on neuromorphic data.
In this work, we conduct a systematic study comparing SNNs and RNNs on neuromorphic data.
arXiv Detail & Related papers (2020-05-02T10:19:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.