Making a Spiking Net Work: Robust brain-like unsupervised machine
learning
- URL: http://arxiv.org/abs/2208.01204v1
- Date: Tue, 2 Aug 2022 02:10:00 GMT
- Authors: Peter G. Stratton, Andrew Wabnitz, Chip Essam, Allen Cheung and Tara
J. Hamilton
- Abstract summary: Spiking Neural Networks (SNNs) are an alternative to Artificial Neural Networks (ANNs).
SNNs struggle with dynamical stability and cannot match the accuracy of ANNs.
We show how an SNN can overcome many of the shortcomings that have been identified in the literature.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The surge in interest in Artificial Intelligence (AI) over the past decade
has been driven almost exclusively by advances in Artificial Neural Networks
(ANNs). While ANNs set state-of-the-art performance for many previously
intractable problems, they require large amounts of data and computational
resources for training, and since they employ supervised learning they
typically need to know the correctly labelled response for every training
example, limiting their scalability for real-world domains. Spiking Neural
Networks (SNNs) are an alternative to ANNs that use more brain-like artificial
neurons and can use unsupervised learning to discover recognizable features in
the input data without knowing correct responses. SNNs, however, struggle with
dynamical stability and cannot match the accuracy of ANNs. Here we show how an
SNN can overcome many of the shortcomings that have been identified in the
literature, including offering a principled solution to the vanishing spike
problem, to outperform all existing shallow SNNs and equal the performance of
an ANN. It accomplishes this while using unsupervised learning with unlabeled
data and only 1/50th of the training epochs (labelled data is used only for a
final simple linear readout layer). This result makes SNNs a viable new method
for fast, accurate, efficient, explainable, and re-deployable machine learning
with unlabeled datasets.
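The abstract describes a two-stage pipeline: an unsupervised spiking feature layer trained on unlabeled data, with labels used only to fit a final simple linear readout. A minimal sketch of that idea is below; the leaky integrate-and-fire neuron model, the Hebbian/STDP-like weight update, the least-squares readout, and all parameter values are illustrative assumptions for this sketch, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions and dynamics (not from the paper).
N_IN, N_HID = 20, 8
TAU, V_TH, LR = 10.0, 1.0, 0.01

W = rng.uniform(0.0, 0.5, size=(N_HID, N_IN))  # input -> hidden weights

def run_layer(spike_train, learn=True):
    """Run a leaky integrate-and-fire layer over a (T, N_IN) binary spike train."""
    v = np.zeros(N_HID)                       # membrane potentials
    out = np.zeros((len(spike_train), N_HID))
    for t, x in enumerate(spike_train):
        v = v * (1.0 - 1.0 / TAU) + W @ x     # leak, then integrate input current
        fired = v >= V_TH
        out[t] = fired
        v[fired] = 0.0                        # reset neurons that spiked
        if learn and fired.any():
            # Hebbian/STDP-like update: pull the weights of neurons that fired
            # toward the input pattern that made them fire (no labels involved).
            W[fired] += LR * (x - W[fired])
    return out

# Unsupervised phase: expose the layer to unlabeled spike trains.
data = (rng.random((50, 100, N_IN)) < 0.1).astype(float)  # 50 trains, 100 steps
for train in data:
    run_layer(train, learn=True)

# Readout phase: labels are used only here, for a simple linear readout
# fitted on spike counts (least squares as a stand-in for the paper's readout).
counts = np.array([run_layer(tr, learn=False).sum(axis=0) for tr in data])
labels = rng.integers(0, 2, size=50)
X = np.c_[counts, np.ones(len(counts))]       # add a bias column
readout, *_ = np.linalg.lstsq(X, labels, rcond=None)
pred = (X @ readout > 0.5).astype(int)
```

The point of the structure, as the abstract emphasizes, is that the expensive feature-learning stage never sees a label; only the cheap final linear map does.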
Related papers
- BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation [20.34272550256856]
Spiking neural networks (SNNs) mimic biological neural system to convey information via discrete spikes.
Our work achieves state-of-the-art performance for training SNNs on both static and neuromorphic datasets.
arXiv Detail & Related papers (2024-07-12T08:17:24Z)
- Training Spiking Neural Networks with Local Tandem Learning [96.32026780517097]
Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors.
In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL)
We demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset while having low computational complexity.
arXiv Detail & Related papers (2022-10-10T10:05:00Z)
- Continual Prune-and-Select: Class-incremental learning with specialized subnetworks [66.4795381419701]
Continual-Prune-and-Select (CP&S) is capable of sequentially learning 10 tasks from ImageNet-1000 keeping an accuracy around 94% with negligible forgetting.
This is a first-of-its-kind result in class-incremental learning.
arXiv Detail & Related papers (2022-08-09T10:49:40Z)
- Toward Robust Spiking Neural Network Against Adversarial Perturbation [22.56553160359798]
Spiking neural networks (SNNs) are increasingly being deployed in real-world, efficiency-critical applications.
Researchers have already demonstrated an SNN can be attacked with adversarial examples.
To the best of our knowledge, this is the first analysis on robust training of SNNs.
arXiv Detail & Related papers (2022-04-12T21:26:49Z)
- N-Omniglot: a Large-scale Neuromorphic Dataset for Spatio-Temporal Sparse Few-shot Learning [10.812738608234321]
We provide the first neuromorphic dataset, N-Omniglot, recorded using a Dynamic Vision Sensor (DVS).
It contains 1623 categories of handwritten characters, with only 20 samples per class.
The dataset provides a powerful challenge and a suitable benchmark for developing SNN algorithms in the few-shot learning domain.
arXiv Detail & Related papers (2021-12-25T12:41:34Z)
- A Time Encoding approach to training Spiking Neural Networks [3.655021726150368]
Spiking Neural Networks (SNNs) have been gaining in popularity.
In this paper, we provide an extra tool to help us understand and train SNNs by using theory from the field of time encoding.
arXiv Detail & Related papers (2021-10-13T14:07:11Z)
- SpikeMS: Deep Spiking Neural Network for Motion Segmentation [7.491944503744111]
SpikeMS is the first deep encoder-decoder SNN architecture for the real-world large-scale problem of motion segmentation.
We show that SpikeMS is capable of incremental predictions, or predictions from smaller amounts of test data than it is trained on.
arXiv Detail & Related papers (2021-05-13T21:34:55Z)
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration [74.5509794733707]
We present a novel guided learning paradigm that distills binary networks from real-valued networks based on the final prediction distribution.
Our proposed method can boost the simple contrastive learning baseline by an absolute gain of 5.515% on BNNs.
Our method achieves substantial improvement over the simple contrastive learning baseline, and is even comparable to many mainstream supervised BNN methods.
arXiv Detail & Related papers (2021-02-17T18:59:28Z)
- Deep Time Delay Neural Network for Speech Enhancement with Full Data Learning [60.20150317299749]
This paper proposes a deep time delay neural network (TDNN) for speech enhancement with full data learning.
To make full use of the training data, we propose a full data learning method for speech enhancement.
arXiv Detail & Related papers (2020-11-11T06:32:37Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising field, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.