Supervised Learning in Temporally-Coded Spiking Neural Networks with
Approximate Backpropagation
- URL: http://arxiv.org/abs/2007.13296v1
- Date: Mon, 27 Jul 2020 03:39:49 GMT
- Title: Supervised Learning in Temporally-Coded Spiking Neural Networks with
Approximate Backpropagation
- Authors: Andrew Stephan, Brian Gardner, Steven J. Koester, Andre Gruning
- Abstract summary: We propose a new supervised learning method for temporally-encoded multilayer spiking networks to perform classification.
The method employs a reinforcement signal that mimics backpropagation but is far less computationally intensive.
In simulated MNIST handwritten digit classification, two-layer networks trained with this rule matched the performance of a comparable backpropagation-based non-spiking network.
- Score: 0.021506382989223777
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work we propose a new supervised learning method for
temporally-encoded multilayer spiking networks to perform classification. The
method employs a reinforcement signal that mimics backpropagation but is far
less computationally intensive. The weight update calculation at each layer
requires only local data apart from this signal. We also employ a rule capable
of producing specific output spike trains; by setting the target spike time
for key high-value neurons equal to the actual spike time minus a slight
offset, the actual spike time becomes as early as possible. In simulated MNIST
handwritten digit classification, two-layer networks trained with this rule
matched the performance of a comparable backpropagation-based non-spiking
network.
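As a rough illustration of the rule described above, the sketch below gives the key high-value output neuron a target spike time slightly earlier than its actual one and applies a layer-local weight update driven by a broadcast reinforcement signal. The function names, the linear update form, and the learning rate are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def assign_targets(actual_times, key_neuron, offset=1.0):
    """Assumed sketch: the key high-value neuron gets a target slightly
    earlier than its actual spike time, so training keeps pulling that spike
    forward; all other neurons keep their actual times as targets."""
    targets = actual_times.copy()
    targets[key_neuron] = actual_times[key_neuron] - offset
    return targets

def local_update(weights, pre_activity, reinforcement, lr=0.01):
    """Sketch of a layer-local update: apart from the reinforcement signal
    broadcast from the output error, only data local to the layer (here,
    the presynaptic activity) is used."""
    return weights + lr * np.outer(reinforcement, pre_activity)
```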
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Properties and Potential Applications of Random Functional-Linked Types
of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structures.
This paper gives some insights into the properties of RFLNNs from the viewpoint of the frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z) - Desire Backpropagation: A Lightweight Training Algorithm for Multi-Layer
Spiking Neural Networks based on Spike-Timing-Dependent Plasticity [13.384228628766236]
Spiking neural networks (SNNs) are a viable alternative to conventional artificial neural networks.
We present desire backpropagation, a method to derive the desired spike activity of all neurons, including the hidden ones.
We trained three-layer networks to classify MNIST and Fashion-MNIST images and reached an accuracy of 98.41% and 87.56%, respectively.
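The summary does not spell out how the desired activity is derived; a loose sketch, assuming each hidden neuron's desire is the sign of the back-projected output error (the paper's actual derivation may differ):

```python
import numpy as np

def hidden_desires(w_hidden_to_out, out_desire, out_activity):
    """Illustrative assumption only: a hidden neuron should spike more (+1)
    when the output neurons it drives need more activity, and less (-1)
    otherwise. w_hidden_to_out has shape (n_hidden, n_out)."""
    out_error = out_desire - out_activity        # per-output deviation
    return np.sign(w_hidden_to_out @ out_error)  # desired change per hidden neuron
```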
arXiv Detail & Related papers (2022-11-10T08:32:13Z) - Local learning through propagation delays in spiking neural networks [0.0]
We propose a novel local learning rule for spiking neural networks in which spike propagation times undergo activity-dependent plasticity.
We demonstrate the use of this method in a three-layer feedforward network with inputs from a database of handwritten digits.
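The abstract does not state the plasticity rule itself; a speculative sketch, assuming each propagation delay is nudged so that the presynaptic spike arrives closer to the postsynaptic spike time:

```python
import numpy as np

def update_delays(delays, t_pre, t_post, lr=0.1):
    """Speculative illustration of activity-dependent delay plasticity; the
    paper's actual rule may differ. delays and t_pre are per-synapse arrays,
    t_post is the postsynaptic spike time."""
    arrival = t_pre + delays                    # when each presynaptic spike arrives
    delays = delays + lr * (t_post - arrival)   # shift arrival toward the post spike
    return np.clip(delays, 0.0, None)           # propagation delays stay non-negative
```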
arXiv Detail & Related papers (2022-10-27T13:48:40Z) - NAF: Neural Attenuation Fields for Sparse-View CBCT Reconstruction [79.13750275141139]
This paper proposes a novel and fast self-supervised solution for sparse-view CBCT reconstruction.
The desired attenuation coefficients are represented as a continuous function of 3D spatial coordinates, parameterized by a fully-connected deep neural network.
A learning-based encoder with hash coding is adopted to help the network capture high-frequency details.
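A heavily simplified sketch of the hash-coding idea (nearest grid corner only, no trilinear interpolation; the table size, resolutions, and hash primes are illustrative assumptions), whose output would feed the fully-connected network that regresses the attenuation coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
LEVELS, TABLE_SIZE, FEAT_DIM = 4, 2**14, 2     # illustrative sizes
# Feature tables are learned jointly with the network in practice.
tables = rng.normal(0.0, 1e-2, (LEVELS, TABLE_SIZE, FEAT_DIM))
PRIMES = np.array([1, 2654435761, 805459861], dtype=np.int64)

def hash_encode(xyz):
    """Map points in [0, 1)^3 to concatenated per-level features via a
    spatial hash over coarse-to-fine grids (simplified: nearest-corner
    lookup instead of interpolating the 8 surrounding corners)."""
    feats = []
    for lvl in range(LEVELS):
        res = 16 * 2 ** lvl                           # grid resolution at this level
        idx = np.floor(np.asarray(xyz) * res).astype(np.int64)
        h = (idx * PRIMES).sum(axis=-1) % TABLE_SIZE  # hash of the corner index
        feats.append(tables[lvl, h])
    return np.concatenate(feats, axis=-1)             # input to the MLP
```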
arXiv Detail & Related papers (2022-09-29T04:06:00Z) - Online learning of windmill time series using Long Short-term Cognitive
Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach achieved the lowest forecasting errors compared with a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z) - Spiking-GAN: A Spiking Generative Adversarial Network Using
Time-To-First-Spike Coding [0.0]
We propose Spiking-GAN, the first spike-based Generative Adversarial Network (GAN).
It employs a temporal coding scheme called time-to-first-spike coding.
Our modified temporal loss function called 'Aggressive TTFS' improves the inference time of the network by over 33% and reduces the number of spikes in the network by more than 11% compared to previous works.
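Time-to-first-spike coding itself is easy to state: stronger inputs fire earlier, and each neuron spikes at most once. A minimal sketch, assuming a linear mapping and an arbitrary time window:

```python
import numpy as np

def time_to_first_spike(pixels, t_max=100.0):
    """Encode intensities in [0, 1] as first-spike times: an intensity of 1
    fires at t = 0, an intensity of 0 fires at t = t_max (or not at all,
    depending on the convention). The mapping and t_max are assumptions."""
    x = np.clip(np.asarray(pixels, dtype=float), 0.0, 1.0)
    return (1.0 - x) * t_max
```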
arXiv Detail & Related papers (2021-06-29T13:43:07Z) - Local Critic Training for Model-Parallel Learning of Deep Neural
Networks [94.69202357137452]
We propose a novel model-parallel learning method, called local critic training.
We show that the proposed approach successfully decouples the update process of the layer groups for both convolutional neural networks (CNNs) and recurrent neural networks (RNNs).
We also show that networks trained by the proposed method can be used for structural optimization.
arXiv Detail & Related papers (2021-02-03T09:30:45Z) - Event-Based Backpropagation can compute Exact Gradients for Spiking
Neural Networks [0.0]
Spiking neural networks combine analog computation with event-based communication using discrete spikes.
For the first time, this work derives the backpropagation algorithm for a continuous-time spiking neural network and a general loss function.
We use gradients computed via EventProp to train networks on the Yin-Yang and MNIST datasets using either a spike time or voltage based loss function and report competitive performance.
arXiv Detail & Related papers (2020-09-17T15:45:00Z) - Coarse scale representation of spiking neural networks: backpropagation
through spikes and application to neuromorphic hardware [0.0]
We explore recurrent representations of leaky integrate-and-fire neurons operating at a timescale equal to their absolute refractory period.
We find that the recurrent model leads to high classification accuracy using just 4-long spike trains during training.
We also observe a good transfer back to continuous implementations of leaky integrate-and-fire neurons.
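A minimal sketch of such a discrete-time leaky integrate-and-fire cell, with the time step identified with the absolute refractory period and a hard reset after each spike (the decay constant and threshold are assumptions):

```python
import numpy as np

def lif_step(v, spikes_in, w, decay=0.9, v_th=1.0):
    """One coarse time step: leak, integrate binary input spikes through the
    weights, emit binary output spikes, and hard-reset neurons that fired
    (they cannot fire again until the next step)."""
    v = decay * v + spikes_in @ w           # leak plus synaptic input
    spikes_out = (v >= v_th).astype(float)  # binary spikes
    v = v * (1.0 - spikes_out)              # reset after a spike
    return v, spikes_out

# Unrolled over a 4-long spike train, as in the reported experiments:
# v = np.zeros(n_out)
# for t in range(4):
#     v, s = lif_step(v, inputs[t], w)
```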
arXiv Detail & Related papers (2020-07-13T04:02:35Z) - Rapid Structural Pruning of Neural Networks with Set-based Task-Adaptive
Meta-Pruning [83.59005356327103]
A common limitation of most existing pruning techniques is that they require pre-training of the network at least once before pruning.
We propose STAMP, which task-adaptively prunes a network pretrained on a large reference dataset by generating a pruning mask on it as a function of the target dataset.
We validate STAMP against recent advanced pruning methods on benchmark datasets.
arXiv Detail & Related papers (2020-06-22T10:57:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.