Online Few-shot Gesture Learning on a Neuromorphic Processor
- URL: http://arxiv.org/abs/2008.01151v2
- Date: Wed, 14 Oct 2020 05:13:05 GMT
- Title: Online Few-shot Gesture Learning on a Neuromorphic Processor
- Authors: Kenneth Stewart, Garrick Orchard, Sumit Bam Shrestha, Emre Neftci
- Abstract summary: We present the Surrogate-gradient Online Error-triggered Learning (SOEL) system for online few-shot learning on neuromorphic processors.
SOEL updates trigger when an error occurs, enabling faster learning with fewer updates.
- Score: 9.084047904714629
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present the Surrogate-gradient Online Error-triggered Learning (SOEL)
system for online few-shot learning on neuromorphic processors. The SOEL
learning system uses a combination of transfer learning and principles of
computational neuroscience and deep learning. We show that partially trained
deep Spiking Neural Networks (SNNs) implemented on neuromorphic hardware can
rapidly adapt online to new classes of data within a domain. SOEL updates
trigger when an error occurs, enabling faster learning with fewer updates.
Using gesture recognition as a case study, we show SOEL can be used for online
few-shot learning of new classes of pre-recorded gesture data and rapid online
learning of new gestures from data streamed live from a Dynamic Active-pixel
Vision Sensor to an Intel Loihi neuromorphic research processor.
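The core idea, updating weights only when an error signal exceeds a threshold, can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's Loihi implementation: the fast-sigmoid surrogate, the threshold `theta`, and all function and parameter names are assumptions.

```python
import numpy as np

def surrogate_grad(u, beta=1.0):
    """Fast-sigmoid surrogate derivative of the spike nonlinearity."""
    return 1.0 / (beta * np.abs(u) + 1.0) ** 2

def soel_step(w, u, pre_trace, target, theta=0.5, lr=0.1):
    """One error-triggered update of an output layer.

    w: (out, in) weights, u: (out,) membrane potentials,
    pre_trace: (in,) low-pass filtered presynaptic spike trace,
    target: (out,) target output (e.g. a one-hot class label).
    """
    rate = (u > 0).astype(float)       # instantaneous spike output
    err = target - rate                # per-neuron error signal
    if np.max(np.abs(err)) > theta:    # update only when the error is large
        w = w + lr * np.outer(err * surrogate_grad(u), pre_trace)
        return w, True
    return w, False                    # small error: skip the update
```

Because updates fire only on large errors, most timesteps perform no weight writes at all, which is what makes the scheme attractive for on-chip learning.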
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Hebbian Learning based Orthogonal Projection for Continual Learning of Spiking Neural Networks [74.3099028063756]
We develop a new method with neuronal operations based on lateral connections and Hebbian learning.
We show that Hebbian and anti-Hebbian learning on recurrent lateral connections can effectively extract the principal subspace of neural activities.
Our method consistently achieves nearly zero forgetting in spiking neural networks.
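As a related single-neuron illustration of Hebbian principal-subspace extraction, classic Oja's rule combines a Hebbian term with a decay that acts like an anti-Hebbian normalization. Note this is a textbook stand-in, not the paper's lateral-connection algorithm; the data and learning rate are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data whose variance is concentrated along the first axis.
x = rng.normal(size=(5000, 2)) * np.array([3.0, 0.5])

def oja_pc1(data, lr=0.005, epochs=10):
    """Oja's rule: Hebbian growth minus an activity-scaled decay;
    w converges to the first principal component of the data."""
    w = np.array([1.0, 1.0]) / np.sqrt(2.0)
    for _ in range(epochs):
        for xi in data:
            y = w @ xi                  # postsynaptic activity
            w += lr * y * (xi - y * w)  # Hebbian term minus decay
    return w
```

Run on the toy data above, the learned weight vector aligns with the high-variance axis and settles near unit norm.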
arXiv Detail & Related papers (2024-02-19T09:29:37Z)
- Learning to learn online with neuromodulated synaptic plasticity in spiking neural networks [0.0]
We show that models of neuromodulated synaptic plasticity from neuroscience can be trained to learn through gradient descent.
This framework opens a new path toward developing neuroscience inspired online learning algorithms.
arXiv Detail & Related papers (2022-06-25T00:28:40Z)
- Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs is on par with that of current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z)
- In-Hardware Learning of Multilayer Spiking Neural Networks on a Neuromorphic Processor [6.816315761266531]
This work presents a spike-based backpropagation algorithm with biologically plausible local update rules and adapts it to fit the constraints of neuromorphic hardware.
The algorithm is implemented on the Intel Loihi chip, enabling low-power in-hardware supervised online learning of multilayer SNNs for mobile applications.
arXiv Detail & Related papers (2021-05-08T09:22:21Z)
- On-Chip Error-triggered Learning of Multi-layer Memristive Spiking Neural Networks [1.7958576850695402]
We propose a local, gradient-based, error-triggered learning algorithm with online ternary weight updates.
The proposed algorithm enables online training of multi-layer SNNs with memristive neuromorphic hardware.
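A hedged sketch of what an error-triggered ternary update can look like: each weight changes by a fixed quantized step whose sign follows the gradient, and only where the gradient magnitude crosses a threshold. The threshold, step size, and function name here are assumptions, not the paper's exact rule.

```python
import numpy as np

def ternary_update(w, grad, theta=0.1, delta=0.05):
    """Error-triggered ternary update: each weight moves by a step
    in {-delta, 0, +delta}, chosen by the sign of its gradient, and
    only where the gradient magnitude exceeds the threshold theta."""
    step = np.where(np.abs(grad) > theta, -np.sign(grad) * delta, 0.0)
    return w + step
```

Quantizing the update to three levels keeps weight writes cheap on memristive crossbars, at the cost of coarser gradient steps.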
arXiv Detail & Related papers (2020-11-21T19:44:19Z)
- Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting [135.0863818867184]
Artificial neural variability (ANV) helps artificial neural networks learn some advantages from "natural" neural networks.
ANV acts as an implicit regularizer of the mutual information between the training data and the learned model.
It can effectively relieve overfitting, label noise memorization, and catastrophic forgetting at negligible costs.
arXiv Detail & Related papers (2020-11-12T06:06:33Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Curriculum By Smoothing [52.08553521577014]
Convolutional Neural Networks (CNNs) have shown impressive performance in computer vision tasks such as image classification, detection, and segmentation.
We propose an elegant curriculum-based scheme that smooths the feature embeddings of a CNN using anti-aliasing or low-pass filters.
As the amount of information in the feature maps increases during training, the network is able to progressively learn better representations of the data.
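A toy sketch of this smoothing curriculum on a 1-D feature map. The paper low-pass filters CNN feature maps; the moving-average kernel and the linear annealing schedule below are illustrative assumptions, not the paper's exact filters.

```python
import numpy as np

def box_blur(fmap, k):
    """Low-pass filter a 1-D feature map with a length-k moving average."""
    if k <= 1:
        return fmap
    return np.convolve(fmap, np.ones(k) / k, mode="same")

def curriculum_kernel(epoch, total_epochs, k_max=9):
    """Anneal the kernel size: early epochs see heavily smoothed
    features, later epochs see progressively sharper ones."""
    frac = 1.0 - epoch / max(total_epochs - 1, 1)
    return max(1, int(round(k_max * frac)) | 1)  # keep the size odd
```

At epoch 0 the kernel is widest (maximum smoothing); by the final epoch it shrinks to size 1, i.e. no filtering, so the network gradually sees richer feature maps.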
arXiv Detail & Related papers (2020-03-03T07:27:44Z)
- Learning to Continually Learn [14.988129334830003]
Inspired by neuromodulatory processes in the brain, we propose A Neuromodulated Meta-Learning Algorithm (ANML).
ANML produces state-of-the-art continual learning performance, sequentially learning as many as 600 classes (over 9,000 SGD updates).
arXiv Detail & Related papers (2020-02-21T22:52:00Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
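As an illustration of why a non-monotonic, dendrite-like activation lets a single neuron compute XOR: the Gaussian bump below is a stand-in, not the paper's exact ADA formula, and the weights and threshold are hand-picked for the sketch.

```python
import numpy as np

def bump(z):
    """A non-monotonic, bump-shaped activation (illustrative stand-in,
    not the paper's exact ADA formula)."""
    return np.exp(-z ** 2)

def xor_neuron(x1, x2):
    """Single neuron: z = x1 + x2 - 1 is zero (so bump(z) = 1) when
    exactly one input is active; thresholding reproduces XOR."""
    return int(bump(x1 + x2 - 1.0) > 0.5)
```

A monotonic activation cannot do this with one neuron, since XOR is not linearly separable; the bump's single interior peak is what makes the difference.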
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.