Supervised Learning with First-to-Spike Decoding in Multilayer Spiking
Neural Networks
- URL: http://arxiv.org/abs/2008.06937v1
- Date: Sun, 16 Aug 2020 15:34:48 GMT
- Title: Supervised Learning with First-to-Spike Decoding in Multilayer Spiking
Neural Networks
- Authors: Brian Gardner, André Grüning
- Abstract summary: We propose a new supervised learning method that can train multilayer spiking neural networks to solve classification problems.
The proposed learning rule supports multiple spikes fired by hidden neurons, yet remains stable by relying on first-spike responses generated by a deterministic output layer.
We also explore several distinct spike-based encoding strategies in order to form compact representations of input data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Experimental studies support the notion of spike-based neuronal information
processing in the brain, with neural circuits exhibiting a wide range of
temporally-based coding strategies to rapidly and efficiently represent sensory
stimuli. Accordingly, it would be desirable to apply spike-based computation to
tackling real-world challenges, and in particular transferring such theory to
neuromorphic systems for low-power embedded applications. Motivated by this, we
propose a new supervised learning method that can train multilayer spiking
neural networks to solve classification problems based on a rapid,
first-to-spike decoding strategy. The proposed learning rule supports multiple
spikes fired by stochastic hidden neurons, and yet is stable by relying on
first-spike responses generated by a deterministic output layer. In addition to
this, we also explore several distinct, spike-based encoding strategies in
order to form compact representations of presented input data. We demonstrate
the classification performance of the learning rule as applied to several
benchmark datasets, including MNIST. The learning rule is capable of
generalising from the data, and is successful even when used with constrained
network architectures containing few input and hidden layer neurons.
Furthermore, we highlight a novel encoding strategy, termed 'scanline
encoding', that can transform image data into compact spatiotemporal patterns
for subsequent network processing. Designing constrained, but optimised,
network structures and performing input dimensionality reduction has strong
implications for neuromorphic applications.
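As an illustration of the first-to-spike decoding strategy described in the abstract, the following minimal sketch shows how a class label can be read out as the index of the earliest-firing output neuron. This is not the authors' implementation; the deterministic leaky integrate-and-fire dynamics, function names, and parameter values (threshold, time constant) are assumptions for illustration only.

```python
import numpy as np

def first_spike_times(input_spikes, weights, threshold=1.0, dt=1.0, tau=20.0):
    """Run a deterministic leaky integrate-and-fire output layer and record
    each neuron's first spike time (np.inf if it never fires).

    input_spikes: (T, n_in) binary array of hidden-layer spikes per time step.
    weights:      (n_in, n_out) synaptic weight matrix.
    """
    T, n_in = input_spikes.shape
    n_out = weights.shape[1]
    v = np.zeros(n_out)               # membrane potentials
    t_first = np.full(n_out, np.inf)  # first spike time per output neuron
    for t in range(T):
        # Leaky integration of incoming hidden-layer spikes.
        v = v * np.exp(-dt / tau) + input_spikes[t] @ weights
        newly = (v >= threshold) & np.isinf(t_first)
        t_first[newly] = t * dt
    return t_first

def predict(input_spikes, weights):
    # First-to-spike decoding: the predicted class is the output
    # neuron that reaches threshold earliest.
    return int(np.argmin(first_spike_times(input_spikes, weights)))
```

Because the readout depends only on which neuron crosses threshold first, classification can terminate as soon as any output neuron spikes, which is what makes the scheme attractive for rapid, low-power inference.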
Related papers
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- NeuralFastLAS: Fast Logic-Based Learning from Raw Data [54.938128496934695]
Symbolic rule learners generate interpretable solutions, however they require the input to be encoded symbolically.
Neuro-symbolic approaches overcome this issue by mapping raw data to latent symbolic concepts using a neural network.
We introduce NeuralFastLAS, a scalable and fast end-to-end approach that trains a neural network jointly with a symbolic learner.
arXiv Detail & Related papers (2023-10-08T12:33:42Z)
- Neuromorphic Auditory Perception by Neural Spiketrum [27.871072042280712]
We introduce a neural spike coding model called spiketrum, to transform time-varying analog signals into efficient spike patterns.
The model provides a sparse and efficient coding scheme with precisely controllable spike rate that facilitates training of spiking neural networks in various auditory perception tasks.
arXiv Detail & Related papers (2023-09-11T13:06:19Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- A Study On the Effects of Pre-processing On Spatio-temporal Action Recognition Using Spiking Neural Networks Trained with STDP [0.0]
It is important to study the behavior of SNNs trained with unsupervised learning methods on video classification tasks.
This paper presents methods of transposing temporal information into a static format, and then transforming the visual information into spikes using latency coding.
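The latency-coding step mentioned in this summary can be sketched as follows: stronger inputs spike earlier, and sub-threshold inputs stay silent. This is a minimal illustration rather than the paper's actual encoder; the normalization to [0, 1], the `t_max` window, and the silence threshold are assumptions of this sketch.

```python
import numpy as np

def latency_encode(pixels, t_max=100.0, threshold=0.1):
    """Latency coding: map each pixel intensity to a single spike time.

    pixels: array of intensities normalized to [0, 1].
    Returns one spike time per pixel; intensity 1.0 spikes at time 0,
    and pixels below `threshold` emit no spike (np.inf).
    """
    pixels = np.asarray(pixels, dtype=float)
    times = t_max * (1.0 - pixels)      # brighter pixels spike earlier
    times[pixels < threshold] = np.inf  # weak inputs stay silent
    return times
```

For example, `latency_encode([1.0, 0.5, 0.0])` yields spike times `[0.0, 50.0, inf]`, turning a static frame into a sparse temporal pattern that a spiking network can process event by event.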
We show the effect of the similarity in the shape and speed of certain actions on action recognition with spiking neural networks.
arXiv Detail & Related papers (2021-05-31T07:07:48Z)
- Discovery of slow variables in a class of multiscale stochastic systems via neural networks [0.0]
We propose a new method to encode in an artificial neural network a map that extracts the slow representation from the system.
We test the method on a number of examples that illustrate the ability to discover a correct slow representation.
arXiv Detail & Related papers (2021-04-28T17:48:25Z)
- An error-propagation spiking neural network compatible with neuromorphic processors [2.432141667343098]
We present a spike-based learning method that approximates back-propagation using local weight update mechanisms.
We introduce a network architecture that enables synaptic weight update mechanisms to back-propagate error signals.
This work represents a first step towards the design of ultra-low power mixed-signal neuromorphic processing systems.
arXiv Detail & Related papers (2021-04-12T07:21:08Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.