VOWEL: A Local Online Learning Rule for Recurrent Networks of
Probabilistic Spiking Winner-Take-All Circuits
- URL: http://arxiv.org/abs/2004.09416v1
- Date: Mon, 20 Apr 2020 16:21:18 GMT
- Title: VOWEL: A Local Online Learning Rule for Recurrent Networks of
Probabilistic Spiking Winner-Take-All Circuits
- Authors: Hyeryung Jang, Nicolas Skatchkovsky and Osvaldo Simeone
- Abstract summary: WTA-SNNs can detect information encoded in spatio-temporal multi-valued events.
Existing schemes for training WTA-SNNs are limited to rate-encoding solutions.
We develop a variational online local training rule for WTA-SNNs, referred to as VOWEL.
- Score: 38.518936229794214
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Networks of spiking neurons and Winner-Take-All spiking circuits (WTA-SNNs)
can detect information encoded in spatio-temporal multi-valued events. These
are described by the timing of events of interest, e.g., clicks, as well as by
categorical numerical values assigned to each event, e.g., like or dislike.
Other use cases include object recognition from data collected by neuromorphic
cameras, which produce, for each pixel, signed bits at the times of
sufficiently large brightness variations. Existing schemes for training
WTA-SNNs are limited to rate-encoding solutions, and are hence able to detect
only spatial patterns. Developing more general training algorithms for
arbitrary WTA-SNNs inherits the challenges of training (binary) Spiking Neural
Networks (SNNs). These amount, most notably, to the non-differentiability of
threshold functions, to the recurrent behavior of spiking neural models, and to
the difficulty of implementing backpropagation in neuromorphic hardware. In
this paper, we develop a variational online local training rule for WTA-SNNs,
referred to as VOWEL, that leverages only local pre- and post-synaptic
information for visible circuits, and an additional common reward signal for
hidden circuits. The method is based on probabilistic generalized linear neural
models, control variates, and variational regularization. Experimental results
on real-world neuromorphic datasets with multi-valued events demonstrate the
advantages of WTA-SNNs over conventional binary SNNs trained with
state-of-the-art methods, especially in the presence of limited computing
resources.
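The rule the abstract describes is a three-factor update: a local presynaptic trace, a local post-synaptic error, and, for hidden circuits, a common scalar reward. Below is a minimal sketch of that flavor of rule, not the paper's exact derivation: it assumes a single hidden WTA circuit modeled as a categorical (softmax) GLM, uses a running reward average as a stand-in for the paper's control variates, and omits the variational regularization term. All names are illustrative.

import numpy as np

rng = np.random.default_rng(0)

K, N = 4, 16                          # categorical output values, presynaptic inputs
W = rng.normal(scale=0.1, size=(K, N))
lr, baseline = 0.05, 0.0              # learning rate, running baseline

def sample_wta(pre_trace):
    """Sample the circuit output from a softmax over membrane potentials
    computed from local presynaptic traces."""
    logits = W @ pre_trace
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return rng.choice(K, p=p), p

def vowel_like_update(pre_trace, k, p, reward):
    """Three-factor update: presynaptic trace (local), post-synaptic error
    (local), and a common reward signal with a baseline for variance reduction."""
    global W, baseline
    post_err = -p
    post_err[k] += 1.0                # gradient of log p(k) w.r.t. the logits
    W += lr * (reward - baseline) * np.outer(post_err, pre_trace)
    baseline = 0.9 * baseline + 0.1 * reward

# Toy loop: reward the circuit whenever output value 0 wins.
for t in range(3000):
    x = rng.random(N)                 # stand-in for a filtered spike trace
    k, p = sample_wta(x)
    vowel_like_update(x, k, p, reward=float(k == 0))
wins = np.mean([sample_wta(rng.random(N))[0] == 0 for _ in range(500)])
print("P(output 0 wins) after training:", wins)

The (reward - baseline) factor is a standard score-function (REINFORCE-style) estimator; per the abstract, visible circuits use only local pre- and post-synaptic information, so no reward signal is needed there.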
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
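A self-contained way to produce the regular-versus-chaotic data this line of work classifies is the logistic map, which is periodic at r = 3.5 and chaotic at r = 4.0. The sketch below generates both regimes and separates them with a crude spectral-peak feature; the feature is illustrative only and is not the paper's LKCNN model.

import numpy as np

def logistic_series(r, n=500, x0=0.3, burn=200):
    """Iterate the logistic map x <- r*x*(1-x) after a burn-in period."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return out

def dominant_peak_ratio(series):
    """Energy of the largest nonzero-frequency peak relative to total
    spectral energy: typically high for periodic signals, low for chaotic."""
    spec = np.abs(np.fft.rfft(series - series.mean())) ** 2
    return spec[1:].max() / spec[1:].sum()

print("periodic (r=3.5):", dominant_peak_ratio(logistic_series(3.5)))
print("chaotic  (r=4.0):", dominant_peak_ratio(logistic_series(4.0)))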
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
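Two of the coding schemes such a heterogeneous design could mix are rate coding (spike-dense, robust) and latency coding (sparse, fast). Below is a minimal sketch of both encoders; the names are illustrative, and the paper's actual hybrid architecture is more involved.

import numpy as np

rng = np.random.default_rng(1)

def rate_encode(x, T=50):
    """Rate coding: intensity x in [0,1] -> Bernoulli spike train with rate x."""
    return (rng.random((T, x.size)) < x).astype(np.int8)

def latency_encode(x, T=50):
    """Latency (temporal) coding: stronger inputs spike earlier, exactly once."""
    t_spike = np.round((1.0 - x) * (T - 1)).astype(int)
    s = np.zeros((T, x.size), dtype=np.int8)
    s[t_spike, np.arange(x.size)] = 1
    return s

x = rng.random(8)
print(rate_encode(x).sum(), "rate-coded spikes vs", latency_encode(x).sum(), "latency-coded spikes")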
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- A Temporal Neural Network Architecture for Online Learning [0.6091702876917281]
Temporal neural networks (TNNs) communicate and process information encoded as relative spike times.
A TNN architecture is proposed and, as a proof-of-concept, TNN operation is demonstrated within the larger context of online supervised classification.
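In a TNN, computation is a race in relative spike time: the first output neuron whose potential crosses threshold wins. A minimal sketch of that mechanism, assuming unit-step EPSPs; the weights and threshold are illustrative.

import numpy as np

def race_winner(spike_times, weights, theta=3.0):
    """Each output accumulates weighted step EPSPs at the relative input
    spike times; the first output to reach theta fires and wins."""
    order = np.argsort(spike_times)            # process inputs by arrival
    fire_time = np.full(weights.shape[0], np.inf)
    for j in range(weights.shape[0]):
        potential = 0.0
        for i in order:
            potential += weights[j, i]
            if potential >= theta:
                fire_time[j] = spike_times[i]
                break
    return int(np.argmin(fire_time)), fire_time

times = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # relative spike times
W = np.array([[1.0, 1.0, 1.0, 0.0, 0.0],       # output 0 listens to early inputs
              [0.0, 0.0, 1.0, 1.0, 1.0]])      # output 1 needs the late inputs
winner, t = race_winner(times, W)
print("winner:", winner, "fire times:", t)     # output 0 fires at t=3.0 and wins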
arXiv Detail & Related papers (2020-11-27T17:15:29Z)
- Spiking Neural Networks -- Part II: Detecting Spatio-Temporal Patterns [38.518936229794214]
Spiking Neural Networks (SNNs) have the unique ability to detect information encoded in spatio-temporal signals.
We review models and training algorithms for the dominant approach that treats SNNs as Recurrent Neural Networks (RNNs).
We describe an alternative approach that relies on probabilistic models for spiking neurons, allowing the derivation of local learning rules via gradient estimates.
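The probabilistic approach mentioned here replaces the hard threshold with a stochastic neuron whose log-likelihood gradient is local. A minimal sketch for a single Bernoulli-GLM neuron, with illustrative names; the review covers far more general models.

import numpy as np

rng = np.random.default_rng(2)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

N, lr = 10, 0.05
w, b = rng.normal(scale=0.1, size=N), 0.0

def glm_step(pre_trace, target_spike):
    """Bernoulli GLM neuron: p(spike) = sigmoid(w . pre + b). The maximum-
    likelihood gradient is purely local: (observed - expected) x pre trace."""
    global w, b
    p = sigmoid(w @ pre_trace + b)
    err = target_spike - p            # post-synaptic error term
    w += lr * err * pre_trace         # pre x post: no backpropagation needed
    b += lr * err

# Toy usage: teach the neuron to spike exactly when input 0 is active.
for t in range(2000):
    x = (rng.random(N) < 0.5).astype(float)
    glm_step(x, target_spike=x[0])
print("p(spike | only x0 active):", sigmoid(w[0] + b))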
arXiv Detail & Related papers (2020-10-27T11:47:42Z)
- Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training [14.992756670960008]
We propose a new type of RSNN called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs).
ScSr-SNNs can boost performance by up to 2.55% compared with other types of RSNNs trained by state-of-the-art BP methods.
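The two ingredients in the name can be sketched in a few lines: each spiking neuron feeds its own previous spike back into its membrane (self-recurrence), and a later layer additionally receives the raw input (skip connection). The dynamics below are a generic leaky integrate-and-fire sketch, not the paper's exact model, and all sizes are illustrative.

import numpy as np

rng = np.random.default_rng(3)

def lif_step(current, v, spikes_prev, w_self, tau=0.9, theta=1.0):
    """Leaky integrate-and-fire step with a self-recurrent term: each
    neuron's own previous spike is fed back into its membrane potential."""
    v = tau * v + current + w_self * spikes_prev
    spikes = (v >= theta).astype(float)
    return v * (1.0 - spikes), spikes           # reset neurons that fired

T, n_in, n_hid = 20, 6, 8
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))
W2 = rng.normal(scale=0.5, size=(n_hid, n_hid))
W_skip = rng.normal(scale=0.5, size=(n_hid, n_in))   # layer-skipping path
w_self = 0.3 * np.ones(n_hid)                        # self-recurrent weights

v1, v2 = np.zeros(n_hid), np.zeros(n_hid)
s1, s2 = np.zeros(n_hid), np.zeros(n_hid)
for t in range(T):
    x_t = (rng.random(n_in) < 0.3).astype(float)
    v1, s1 = lif_step(W1 @ x_t, v1, s1, w_self)
    # Layer 2 sees layer-1 spikes plus a skip connection from the raw input.
    v2, s2 = lif_step(W2 @ s1 + W_skip @ x_t, v2, s2, w_self)
print("layer-2 spikes at the last step:", s2)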
arXiv Detail & Related papers (2020-10-23T22:27:13Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm that constitutes the essential learning theory in biological neurons.
Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, achieve superior learning performance compared to GIS-based ONNs.
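Where a convolutional neuron fixes its nodal operator to multiplication and its pooling operator to summation, an operational neuron draws both from a library, and training searches that library. A minimal sketch of the generalization; the operator sets here are illustrative, not the paper's.

import numpy as np

# A convolutional neuron is the special case nodal='mul', pool='sum'.
NODAL = {
    "mul": lambda w, x: w * x,
    "sin": lambda w, x: np.sin(w * x),
    "gauss": lambda w, x: np.exp(-(w * x) ** 2),
}
POOL = {"sum": np.sum, "max": np.max, "median": np.median}

def operational_neuron(x, w, nodal, pool):
    """y = pool_i( nodal(w_i, x_i) ); searching over (nodal, pool) pairs is
    the operator-set search the summary above refers to."""
    return POOL[pool](NODAL[nodal](w, x))

x, w = np.linspace(-1, 1, 9), np.ones(9)
for nk in NODAL:
    for pk in POOL:
        print(f"{nk:>5}/{pk:<6} -> {operational_neuron(x, w, nk, pk):+.3f}")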
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Multi-Sample Online Learning for Probabilistic Spiking Neural Networks [43.8805663900608]
Spiking Neural Networks (SNNs) capture some of the efficiency of biological brains for inference and learning.
This paper introduces an online learning rule based on generalized expectation-maximization (GEM).
Experimental results on structured output memorization and classification on a standard neuromorphic data set demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration.
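The variance-reduction idea behind multi-sample rules can be shown with a generic K-sample score-function estimator using leave-one-out baselines; this is a sketch of that general idea, not necessarily the paper's exact GEM rule.

import numpy as np

rng = np.random.default_rng(4)

def multi_sample_grad(logit, reward_fn, K=8):
    """K-sample score-function gradient for a Bernoulli spike with
    p = sigmoid(logit); each sample is baselined by the mean reward of
    the other samples, which cuts variance without adding bias."""
    p = 1.0 / (1.0 + np.exp(-logit))
    s = (rng.random(K) < p).astype(float)      # K independent spike samples
    r = np.array([reward_fn(si) for si in s])
    g = 0.0
    for k in range(K):
        baseline = (r.sum() - r[k]) / (K - 1)  # leave-one-out baseline
        g += (r[k] - baseline) * (s[k] - p)    # d log p(s)/d logit = s - p
    return g / K

# Toy usage: rewarding spikes should push the logit (and p) upward.
logit = 0.0
for step in range(200):
    logit += 0.5 * multi_sample_grad(logit, reward_fn=lambda s: s)
print("p(spike) after ascent:", 1.0 / (1.0 + np.exp(-logit)))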
arXiv Detail & Related papers (2020-07-23T10:03:58Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
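For orientation, the simplest instance of a mean-field limit is the two-layer case, where the network becomes an integral against a probability measure over parameters and training becomes a gradient flow on that measure; the paper's feature-based framework generalizes this picture to deep networks. The display below is the standard two-layer background, not the paper's formulation.

f_\rho(x) = \int \sigma(\langle \theta, x \rangle)\, \rho(\mathrm{d}\theta),
\qquad
\partial_t \rho_t = \nabla_\theta \cdot \Big( \rho_t \, \nabla_\theta \frac{\delta L(f_{\rho_t})}{\delta \rho_t} \Big).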
arXiv Detail & Related papers (2020-07-03T01:37:16Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
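Most conversion pipelines rest on the correspondence between a ReLU activation and the firing rate of an integrate-and-fire neuron with soft reset; the layer-wise fine-tuning the paper adds is not shown here. A minimal sketch of that rate correspondence, with illustrative parameters:

def if_rate(z, T=1000, theta=1.0):
    """Integrate-and-fire neuron driven by a constant current z for T steps;
    with soft reset, its firing rate approximates relu(z) for z in [0, theta]."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += z
        if v >= theta:
            v -= theta            # soft reset keeps the residual charge
            spikes += 1
    return spikes / T

for z in [-0.2, 0.0, 0.25, 0.5, 0.9]:
    print(f"z={z:+.2f}  relu={max(z, 0.0):.2f}  IF rate={if_rate(z):.2f}")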
arXiv Detail & Related papers (2020-07-02T15:38:44Z)