Learning Spiking Neural Network from Easy to Hard task
- URL: http://arxiv.org/abs/2309.04737v3
- Date: Tue, 26 Sep 2023 02:33:21 GMT
- Title: Learning Spiking Neural Network from Easy to Hard task
- Authors: Lingling Tang, Jiangtao Hu, Hua Yu, Surui Liu, Jielei Chu
- Abstract summary: Spiking Neural Networks (SNNs) aim to mimic the way humans process information.
Current SNN models treat all samples equally, which does not align with the principles of human learning.
We propose a CL-SNN model that introduces Curriculum Learning into SNNs, making SNNs learn more like humans.
- Score: 1.9559989943764062
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Starting with small and simple concepts, and gradually introducing complex
and difficult concepts is the natural process of human learning. Spiking Neural
Networks (SNNs) aim to mimic the way humans process information, but current
SNN models treat all samples equally, which does not align with the principles
of human learning and overlooks the biological plausibility of SNNs. To address
this, we propose a CL-SNN model that introduces Curriculum Learning (CL) into
SNNs, making SNNs learn more like humans and providing higher biological
interpretability. CL is a training strategy that advocates presenting easier
data to models before gradually introducing more challenging data, mimicking
the human learning process. We use a confidence-aware loss to measure and
process samples of different difficulty levels. By learning the
confidence of different samples, the model reduces the contribution of
difficult samples to parameter optimization automatically. We conducted
experiments on static image datasets MNIST, Fashion-MNIST, CIFAR10, and
neuromorphic datasets N-MNIST, CIFAR10-DVS, DVS-Gesture. The results are
promising. To the best of our knowledge, this is the first proposal to enhance the
biological plausibility of SNNs by introducing CL.
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose a Hybrid Step-wise Distillation (HSD) method tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Are Sparse Neural Networks Better Hard Sample Learners? [24.2141078613549]
Hard samples play a crucial role in the optimal performance of deep neural networks.
Sparse networks trained on challenging samples can often match or surpass dense models in accuracy at certain sparsity levels.
arXiv Detail & Related papers (2024-09-13T21:12:18Z) - Training Spiking Neural Networks via Augmented Direct Feedback Alignment [3.798885293742468]
Spiking neural networks (SNNs) are promising solutions for implementing neural networks in neuromorphic devices.
However, the nondifferentiable nature of SNN neurons makes it a challenge to train them.
In this paper, we propose using augmented direct feedback alignment (aDFA), a gradient-free approach based on random projection, to train SNNs.
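For intuition, here is a minimal NumPy sketch of plain direct feedback alignment on a rate-based two-layer network; the paper's augmented variant and its SNN-specific details are not reproduced, and all sizes and rates below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out, lr = 784, 256, 10, 0.01
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))
B = rng.normal(0.0, 0.1, (n_hid, n_out))  # fixed random feedback matrix

def dfa_step(x, y_onehot):
    """One update: the output error reaches the hidden layer through
    the fixed random matrix B instead of W2.T (no backpropagation)."""
    global W1, W2
    h = np.maximum(W1 @ x, 0.0)        # ReLU hidden activity
    e = W2 @ h - y_onehot              # output error
    dh = (B @ e) * (h > 0.0)           # random projection of the error
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(dh, x)
```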
arXiv Detail & Related papers (2024-09-12T06:22:44Z) - ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural
Networks [20.33499499020257]
Spiking neural networks (SNNs) have manifested remarkable advantages in power consumption and event-driven processing during inference.
We propose an efficient evolutionary structure learning framework for SNNs, named ESL-SNNs, to implement the sparse SNN training from scratch.
Our work presents a brand-new approach for sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms.
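As a rough sketch of what sparse-from-scratch training with evolutionary rewiring can look like, here is a SET-style prune-and-regrow step in NumPy; ESL-SNNs' actual evolutionary mechanism is not specified here, so this stand-in is an assumption.

```python
import numpy as np

def prune_and_regrow(W, frac=0.1, rng=np.random.default_rng(0)):
    """One rewiring step on a sparse weight matrix: drop the weakest
    active connections, regrow the same number at random empty slots,
    keeping overall sparsity constant throughout training."""
    flat = W.ravel()  # view into W (assumes a contiguous array)
    active = np.flatnonzero(flat)
    n = max(1, int(frac * active.size))
    weakest = active[np.argsort(np.abs(flat[active]))[:n]]
    flat[weakest] = 0.0                                # prune
    empty = np.flatnonzero(flat == 0.0)
    grown = rng.choice(empty, size=n, replace=False)
    flat[grown] = rng.normal(0.0, 0.01, size=n)        # regrow
    return W
```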
arXiv Detail & Related papers (2023-06-06T14:06:11Z) - Constructing Deep Spiking Neural Networks from Artificial Neural
Networks with Knowledge Distillation [20.487853773309563]
Spiking neural networks (SNNs) are well known as the brain-inspired models with high computing efficiency.
We propose a novel method of constructing deep SNN models with knowledge distillation (KD).
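For reference, here is the generic knowledge-distillation objective (Hinton et al.) in PyTorch; whether the paper adds SNN-specific terms is not shown here, so treat this as a baseline sketch with illustrative defaults.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Hard-label cross-entropy plus temperature-softened KL to the
    (e.g., ANN) teacher's prediction distribution."""
    hard = F.cross_entropy(student_logits, targets)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to be temperature-independent
    return alpha * hard + (1.0 - alpha) * soft
```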
arXiv Detail & Related papers (2023-04-12T05:57:21Z) - Toward Robust Spiking Neural Network Against Adversarial Perturbation [22.56553160359798]
Spiking neural networks (SNNs) are increasingly deployed in real-world, efficiency-critical applications.
Researchers have already demonstrated an SNN can be attacked with adversarial examples.
To the best of our knowledge, this is the first analysis on robust training of SNNs.
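A minimal PyTorch sketch of the standard FGSM attack that such robustness analyses typically start from; the paper's specific attack and training scheme are not reproduced, and `model` is assumed to be end-to-end differentiable (e.g., via surrogate gradients).

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=8 / 255):
    """One-step FGSM: perturb the input along the sign of the loss
    gradient; adversarial training feeds these back into training."""
    x = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x), y).backward()
    return (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()
```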
arXiv Detail & Related papers (2022-04-12T21:26:49Z) - Rethinking Nearest Neighbors for Visual Classification [56.00783095670361]
k-NN is a lazy learning method that aggregates the distances between the test image and its top-k neighbors in a training set.
We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps.
Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration.
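A minimal sketch of the k-NN step over frozen pre-trained features; the names and the similarity-weighted voting are assumptions, not necessarily the paper's exact recipe.

```python
import numpy as np

def knn_predict(test_feat, bank_feats, bank_labels, k=20, n_classes=10):
    """Classify one L2-normalized test feature by similarity-weighted
    voting over its k nearest neighbors in a frozen feature bank."""
    sims = bank_feats @ test_feat          # cosine similarities, (N,)
    topk = np.argpartition(-sims, k)[:k]   # indices of the k largest
    votes = np.zeros(n_classes)
    for i in topk:
        votes[bank_labels[i]] += sims[i]   # weight each vote by similarity
    return int(votes.argmax())
```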
arXiv Detail & Related papers (2021-12-15T20:15:01Z) - S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural
Networks via Guided Distribution Calibration [74.5509794733707]
We present a novel guided learning paradigm that distills real-valued networks into binary networks by calibrating the final prediction distribution.
Our proposed method can boost the simple contrastive learning baseline by an absolute gain of 5.515% on BNNs.
Our method achieves substantial improvement over the simple contrastive learning baseline, and is even comparable to many mainstream supervised BNN methods.
arXiv Detail & Related papers (2021-02-17T18:59:28Z) - Exploiting Heterogeneity in Operational Neural Networks by Synaptic
Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm that constitutes the essential learning theory in biological neurons.
Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
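The core idea behind ANN-to-SNN conversion is that an integrate-and-fire neuron's firing rate approximates a ReLU activation; a minimal sketch of that correspondence follows (the threshold and time horizon are illustrative).

```python
def if_rate(drive, T=100, v_th=1.0):
    """Simulate an integrate-and-fire neuron for T steps under a
    constant input drive; the spike rate approximates ReLU(drive),
    clipped at 1, which is what conversion methods exploit."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += drive
        if v >= v_th:
            spikes += 1
            v -= v_th          # soft reset preserves the rate code
    return spikes / T
```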
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Neural Additive Models: Interpretable Machine Learning with Neural Nets [77.66871378302774]
Deep neural networks (DNNs) are powerful black-box predictors that have achieved impressive performance on a wide variety of tasks.
We propose Neural Additive Models (NAMs) which combine some of the expressivity of DNNs with the inherent intelligibility of generalized additive models.
NAMs learn a linear combination of neural networks that each attend to a single input feature.
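A compact sketch of the NAM architecture as described: one small subnetwork per input feature, with the outputs summed. The original paper uses ExU units; the plain MLP subnets here are a simplifying assumption.

```python
import torch
import torch.nn as nn

class NAM(nn.Module):
    """Neural Additive Model: each feature's contribution comes from
    its own subnetwork and can be inspected in isolation, which is
    what makes the model interpretable."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.feature_nets = nn.ModuleList(
            [nn.Sequential(nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 1))
             for _ in range(n_features)]
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, x):  # x: (B, n_features)
        contribs = [net(x[:, j:j + 1]) for j, net in enumerate(self.feature_nets)]
        return torch.cat(contribs, dim=1).sum(dim=1) + self.bias
```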
arXiv Detail & Related papers (2020-04-29T01:28:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.