Brain-Inspired Learning on Neuromorphic Substrates
- URL: http://arxiv.org/abs/2010.11931v1
- Date: Thu, 22 Oct 2020 17:56:59 GMT
- Title: Brain-Inspired Learning on Neuromorphic Substrates
- Authors: Friedemann Zenke and Emre O. Neftci
- Abstract summary: This article provides a mathematical framework for the design of practical online learning algorithms for neuromorphic substrates.
Specifically, we show a direct connection between Real-Time Recurrent Learning (RTRL) and biologically plausible learning rules for training Spiking Neural Networks (SNNs).
We motivate a sparse approximation based on block-diagonal Jacobians, which reduces the algorithm's computational complexity.
- Score: 5.279475826661643
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic hardware strives to emulate brain-like neural networks and thus
holds the promise for scalable, low-power information processing on temporal
data streams. Yet, to solve real-world problems, these networks need to be
trained. However, training on neuromorphic substrates creates significant
challenges due to the offline character and the required non-local computations
of gradient-based learning algorithms. This article provides a mathematical
framework for the design of practical online learning algorithms for
neuromorphic substrates. Specifically, we show a direct connection between
Real-Time Recurrent Learning (RTRL), an online algorithm for computing
gradients in conventional Recurrent Neural Networks (RNNs), and biologically
plausible learning rules for training Spiking Neural Networks (SNNs). Further,
we motivate a sparse approximation based on block-diagonal Jacobians, which
reduces the algorithm's computational complexity, diminishes the non-local
information requirements, and empirically leads to good learning performance,
thereby improving its applicability to neuromorphic substrates. In summary, our
framework bridges the gap between synaptic plasticity and gradient-based
approaches from deep learning and lays the foundations for powerful information
processing on future neuromorphic hardware systems.
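To make the core idea concrete, here is a minimal NumPy sketch of online RTRL with the kind of (block-)diagonal Jacobian approximation the abstract describes. This is not the authors' code: a conventional tanh RNN stands in for a spiking network, only the recurrent weights are trained, and all sizes and names are illustrative. Dropping the cross-unit entries of the RTRL influence tensor collapses each synapse's sensitivity to a scalar eligibility trace, which is what makes the rule online and local.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 20, 1                          # illustrative sizes
W = rng.normal(0, 1 / np.sqrt(n_hid), (n_hid, n_hid))  # recurrent weights
U = rng.normal(0, 1 / np.sqrt(n_in), (n_hid, n_in))    # input weights
V = rng.normal(0, 1 / np.sqrt(n_hid), (n_out, n_hid))  # linear readout
lr = 1e-2

def rtrl_step(x, h, P, y_target):
    """One online step of RTRL with a diagonal Jacobian approximation.

    P[i, j] approximates d h_i / d W_ij. Full RTRL would propagate every
    sensitivity through the whole recurrent Jacobian; keeping only its
    diagonal (unit i influences itself) turns P into per-synapse scalar
    eligibility traces that update locally, in constant time per step.
    """
    a = W @ h + U @ x
    h_new = np.tanh(a)
    d = 1.0 - h_new ** 2                      # tanh'(a), local to each unit

    # P_ij <- phi'(a_i) * (W_ii * P_ij + h_j)
    P = d[:, None] * (np.diag(W)[:, None] * P + h[None, :])

    y = V @ h_new
    err = y - y_target                        # dL/dy for squared error
    learning_signal = V.T @ err               # per-unit credit, no BPTT

    # Three-factor update: top-down learning signal x local eligibility.
    W_grad = learning_signal[:, None] * P
    V_grad = np.outer(err, h_new)
    return h_new, P, W_grad, V_grad

# Toy usage: predict a delayed sine wave fully online.
h, P = np.zeros(n_hid), np.zeros((n_hid, n_hid))
for t in range(2000):
    x = np.array([np.sin(0.1 * t), np.cos(0.1 * t), 1.0])
    target = np.array([np.sin(0.1 * (t - 5))])
    h, P, W_grad, V_grad = rtrl_step(x, h, P, target)
    W -= lr * W_grad
    V -= lr * V_grad
```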
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning; a toy sketch of the forward-forward idea follows this entry.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
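The sketch below illustrates the forward-forward family that CSDP builds on; it is a generic toy, not the paper's plasticity rule, and all names (FFLayer, theta) are assumptions made for illustration. Each layer pushes a local "goodness" above a threshold for positive data and below it for negative data, so no backward pass is ever needed.

```python
import numpy as np

rng = np.random.default_rng(1)

class FFLayer:
    """A layer trained with a forward-forward-style local objective:
    drive the goodness (mean squared activity) above a threshold for
    positive samples and below it for negative ones. Weight changes
    depend only on this layer's own input and output, never on a
    backpropagated signal from layers above."""

    def __init__(self, n_in, n_out, theta=2.0, lr=0.03):
        self.W = rng.normal(0, 1 / np.sqrt(n_in), (n_out, n_in))
        self.theta, self.lr = theta, lr

    def local_update(self, x, positive):
        # Normalize so only the input's direction is passed upward;
        # its length (the previous layer's goodness) is hidden.
        xn = x / (np.linalg.norm(x) + 1e-8)
        h = np.maximum(0.0, self.W @ xn)        # ReLU activity
        g = np.mean(h ** 2)                     # goodness
        p = 1.0 / (1.0 + np.exp(-(g - self.theta)))
        # Descend the local logistic loss for this sample's label.
        grad_g = (1.0 - p) if positive else -p
        self.W += self.lr * grad_g * (2.0 / h.size) * np.outer(h, xn)
        return h

# Toy usage: two layers trained greedily on real vs. corrupted inputs.
layers = [FFLayer(10, 32), FFLayer(32, 32)]
for _ in range(500):
    pos = rng.normal(0, 1, 10); pos[:5] += 2.0  # "real" pattern
    neg = rng.permutation(pos)                  # negative sample
    for label, x in ((True, pos), (False, neg)):
        for layer in layers:
            x = layer.local_update(x, label)
```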
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Advanced Computing and Related Applications Leveraging Brain-inspired Spiking Neural Networks [0.0]
Spiking neural networks are one of the core technologies of artificial intelligence for realizing brain-like computing.
This paper summarizes the strengths, weaknesses and applicability of five neuronal models and analyzes the characteristics of five network topologies.
arXiv Detail & Related papers (2023-09-08T16:41:08Z)
- Deep Learning Meets Sparse Regularization: A Signal Processing Perspective [17.12783792226575]
We present a mathematical framework that characterizes the functional properties of neural networks that are trained to fit data.
Key mathematical tools which support this framework include transform-domain sparse regularization, the Radon transform of computed tomography, and approximation theory.
This framework explains the effect of weight decay regularization in neural network training, the use of skip connections and low-rank weight matrices in network architectures, the role of sparsity in neural networks, and why neural networks can perform well in high-dimensional problems.
arXiv Detail & Related papers (2023-01-23T17:16:21Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Predictive Coding: Towards a Future of Deep Learning beyond Backpropagation? [41.58529335439799]
The backpropagation of error algorithm used to train deep neural networks has been fundamental to the successes of deep learning.
Recent work has developed the idea into a general-purpose algorithm able to train neural networks using only local computations.
We show the substantially greater flexibility of predictive coding networks compared with equivalent deep neural networks; a minimal sketch of the local-computation idea follows this entry.
arXiv Detail & Related papers (2022-02-18T22:57:03Z)
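To unpack what "only local computations" means here, the following is a generic NumPy toy of predictive coding, not the paper's implementation; sizes, step counts, and names are assumptions. Activities are relaxed to reduce layer-wise prediction errors while input and target are clamped; afterwards, every weight update is a Hebbian-style product of an error and an activity available at that connection.

```python
import numpy as np

rng = np.random.default_rng(2)
f = np.tanh
df = lambda v: 1.0 - np.tanh(v) ** 2

sizes = [4, 16, 2]                              # illustrative layer sizes
Ws = [rng.normal(0, 1 / np.sqrt(m), (n, m)) for m, n in zip(sizes, sizes[1:])]

def pc_train_step(x_in, y, Ws, T=20, dt=0.1, lr=0.01):
    """One predictive-coding step: clamp input and target, relax the
    hidden activities to reduce the prediction-error energy, then update
    each weight from quantities available at that connection alone."""
    xs = [x_in] + [np.zeros(n) for n in sizes[1:]]
    xs[-1] = y                                   # clamp the target
    for _ in range(T):                           # inference phase
        eps = [xs[l + 1] - Ws[l] @ f(xs[l]) for l in range(len(Ws))]
        for l in range(1, len(sizes) - 1):
            # A hidden node sees only its own error and the error it feeds.
            xs[l] += dt * (-eps[l - 1] + df(xs[l]) * (Ws[l].T @ eps[l]))
    eps = [xs[l + 1] - Ws[l] @ f(xs[l]) for l in range(len(Ws))]
    for l in range(len(Ws)):                     # local, Hebbian-like updates
        Ws[l] += lr * np.outer(eps[l], f(xs[l]))

# Toy usage: learn a fixed input-target pair without any backward pass.
x_in = rng.normal(0, 1, 4)
y = np.array([0.5, -0.5])
for _ in range(200):
    pc_train_step(x_in, y, Ws)
```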
- On-Chip Error-triggered Learning of Multi-layer Memristive Spiking Neural Networks [1.7958576850695402]
We propose a local, gradient-based, error-triggered learning algorithm with online ternary weight updates.
The proposed algorithm enables online training of multi-layer SNNs with memristive neuromorphic hardware; a toy illustration of the trigger-and-quantize idea follows this entry.
arXiv Detail & Related papers (2020-11-21T19:44:19Z)
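The two ingredients this summary names, an error threshold that gates learning and ternary weight changes, can be illustrated with a toy dense layer. This is a sketch under assumed names and sizes, not the authors' memristive circuit or spiking model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_out = 8, 4                    # illustrative layer size
W = rng.normal(0, 0.1, (n_out, n_in))
theta = 0.05                          # error threshold that gates updates
delta = 0.002                         # fixed quantized step size

def error_triggered_update(x, target, W):
    """Local, error-triggered learning with ternary weight updates.
    A synapse is touched only when its postsynaptic error exceeds the
    threshold, and then it moves by a fixed quantized step of sign
    -1, 0, or +1 -- a good match for memristive devices that only
    support coarse conductance changes."""
    y = W @ x
    err = y - target                                  # local output error
    gated = np.where(np.abs(err) > theta, err, 0.0)   # trigger condition
    # Ternary update: sign of (local error x presynaptic activity).
    W -= delta * np.sign(np.outer(gated, x))
    return y, np.count_nonzero(gated)

# Toy usage: updates become rare as errors fall below the threshold.
x_fixed = rng.normal(0, 1, n_in)
target = rng.normal(0, 1, n_out)
for step in range(3000):
    y, n_triggered = error_triggered_update(x_fixed, target, W)
```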
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition; the rate-coding intuition behind such conversions is sketched after this entry.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
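ANN-to-SNN conversion rests on a standard rate-coding argument, shown directly in the generic sketch below (not the paper's progressive tandem scheme): an integrate-and-fire neuron with reset-by-subtraction and constant input fires at a rate that approximates a ReLU of that input.

```python
import numpy as np

rng = np.random.default_rng(4)

def relu_forward(W, x):
    return np.maximum(0.0, W @ x)

def if_layer_rate(W, x_rate, T=1000, v_th=1.0):
    """Simulate a layer of integrate-and-fire neurons for T steps and
    return empirical firing rates. With constant input current W @ x_rate
    and reset-by-subtraction, the rate approaches max(0, W @ x_rate)/v_th,
    i.e. the spiking layer approximates a ReLU. Rates saturate at one
    spike per step, which is why conversion pipelines normalize the
    ANN's activations before copying its weights."""
    I = W @ x_rate                      # constant input current
    v = np.zeros(len(I))
    spikes = np.zeros(len(I))
    for _ in range(T):
        v += I
        fired = v >= v_th
        spikes += fired
        v[fired] -= v_th                # reset by subtraction
    return spikes / T

W = rng.normal(0, 0.1, (5, 8))
x = rng.uniform(0, 1, 8)
print(relu_forward(W, x))               # analog ReLU activations
print(if_layer_rate(W, x))              # spike rates approximating them
```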
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms that mimic the operational principles of neurons and synapses.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Structural plasticity on an accelerated analog neuromorphic hardware system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology; a toy version of the rewiring loop follows this entry.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)
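The rewiring strategy described in this last entry can be caricatured in a few lines. This is a software toy under assumed sizes and thresholds, not the BrainScaleS-2 implementation: every postsynaptic neuron has a fixed synapse budget, and a slot whose weight has decayed below a floor is pruned and reconnected to a randomly drawn presynaptic partner, so the total hardware synapse count stays constant.

```python
import numpy as np

rng = np.random.default_rng(5)
n_pre, n_post, fan_in = 50, 10, 8      # fixed per-neuron synapse budget

# Each postsynaptic neuron stores the indices and weights of its synapses.
pre_idx = np.stack([rng.choice(n_pre, fan_in, replace=False)
                    for _ in range(n_post)])
weights = rng.uniform(0.0, 0.1, (n_post, fan_in))

def rewire(pre_idx, weights, w_min=0.01):
    """Structural plasticity sketch: for every postsynaptic neuron, prune
    its weakest synapse if the weight has decayed below w_min and rewire
    that slot to a randomly drawn new presynaptic partner. The number of
    synapses (the scarce hardware resource) never changes."""
    for i in range(n_post):
        k = np.argmin(weights[i])
        if weights[i, k] < w_min:
            candidates = np.setdiff1d(np.arange(n_pre), pre_idx[i])
            pre_idx[i, k] = rng.choice(candidates)
            weights[i, k] = w_min       # a fresh synapse starts weak
    return pre_idx, weights

# Interleave rewiring with any weight-learning rule; random drift is a
# stand-in for the plasticity rule running on the hardware.
for epoch in range(100):
    weights += rng.normal(0, 0.005, weights.shape)
    np.clip(weights, 0.0, 1.0, out=weights)
    pre_idx, weights = rewire(pre_idx, weights)
```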
This list is automatically generated from the titles and abstracts of the papers on this site.