EqSpike: Spike-driven Equilibrium Propagation for Neuromorphic
Implementations
- URL: http://arxiv.org/abs/2010.07859v3
- Date: Wed, 17 Feb 2021 14:48:02 GMT
- Title: EqSpike: Spike-driven Equilibrium Propagation for Neuromorphic
Implementations
- Authors: Erwann Martin, Maxence Ernoult, Jérémie Laydevant, Shuai Li,
Damien Querlioz, Teodora Petrisor, Julie Grollier
- Abstract summary: We develop a spiking neural network algorithm called EqSpike, compatible with neuromorphic systems.
We show that EqSpike implemented in silicon neuromorphic technology could reduce the energy consumption of inference and training by three and two orders of magnitude, respectively, compared to GPUs.
- Score: 9.952561670370804
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Finding spike-based learning algorithms that can be implemented within the
local constraints of neuromorphic systems, while achieving high accuracy,
remains a formidable challenge. Equilibrium Propagation is a promising
alternative to backpropagation as it only involves local computations, but
hardware-oriented studies have so far focused on rate-based networks. In this
work, we develop a spiking neural network algorithm called EqSpike, compatible
with neuromorphic systems, which learns by Equilibrium Propagation. Through
simulations, we obtain a test recognition accuracy of 97.6% on MNIST, similar
to rate-based Equilibrium Propagation, and comparing favourably to alternative
learning techniques for spiking neural networks. We show that EqSpike
implemented in silicon neuromorphic technology could reduce the energy
consumption of inference and training respectively by three orders and two
orders of magnitude compared to GPUs. Finally, we also show that during
learning, EqSpike weight updates exhibit a form of Spike Timing Dependent
Plasticity, highlighting a possible connection with biology.
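For readers unfamiliar with the underlying learning rule, the sketch below illustrates the rate-based, two-phase Equilibrium Propagation update that EqSpike translates into spike-driven form: a free relaxation phase, a weakly nudged phase, and a purely local contrastive weight update. It is a minimal Python sketch with simplified settling dynamics; the layer sizes, step counts and learning rates are illustrative assumptions, not the paper's actual EqSpike implementation.

```python
# Minimal sketch of rate-based Equilibrium Propagation (Scellier & Bengio, 2017)
# on a two-layer network; dynamics and hyperparameters are illustrative only.
import numpy as np

rho = lambda s: np.clip(s, 0.0, 1.0)  # hard-sigmoid firing-rate nonlinearity

def relax(x, y, h, o, W1, W2, beta=0.0, steps=100, dt=0.2):
    """Settle hidden (h) and output (o) units toward a fixed point.
    beta = 0 is the free phase; beta > 0 weakly nudges the output toward y."""
    for _ in range(steps):
        dh = -h + rho(x) @ W1 + rho(o) @ W2.T   # symmetric feedforward/feedback drive
        do = -o + rho(h) @ W2 + beta * (y - o)  # target nudging on the output layer
        h, o = h + dt * dh, o + dt * do
    return h, o

def eqprop_step(x, y, W1, W2, beta=0.5, lr=0.05):
    """One Equilibrium Propagation update: contrast free and nudged fixed points."""
    h0 = np.zeros(W1.shape[1]); o0 = np.zeros(W2.shape[1])
    h0, o0 = relax(x, y, h0, o0, W1, W2, beta=0.0)   # free phase
    hb, ob = relax(x, y, h0, o0, W1, W2, beta=beta)  # nudged phase, from the free fixed point
    # Purely local, Hebbian-like contrastive updates built from pre/post rates only.
    W1 += (lr / beta) * (np.outer(rho(x),  rho(hb)) - np.outer(rho(x),  rho(h0)))
    W2 += (lr / beta) * (np.outer(rho(hb), rho(ob)) - np.outer(rho(h0), rho(o0)))
    return W1, W2

# Hypothetical usage on one MNIST-sized input/target pair.
rng = np.random.default_rng(0)
W1 = 0.1 * rng.standard_normal((784, 128))
W2 = 0.1 * rng.standard_normal((128, 10))
x, y = rng.random(784), np.eye(10)[3]
W1, W2 = eqprop_step(x, y, W1, W2)
```

Because each weight update depends only on the pre- and postsynaptic rates at the two fixed points, the rule remains local, which is what makes this family of algorithms attractive for neuromorphic hardware.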
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Advancing Spatio-Temporal Processing in Spiking Neural Networks through Adaptation [6.233189707488025]
In this article, we analyze the dynamical, computational, and learning properties of adaptive LIF neurons and networks thereof.
We show that the superiority of networks of adaptive LIF neurons extends to the prediction and generation of complex time series.
arXiv Detail & Related papers (2024-08-14T12:49:58Z)
- Neuromorphic Auditory Perception by Neural Spiketrum [27.871072042280712]
We introduce a neural spike coding model called spiketrum to transform time-varying analog signals into efficient spatio-temporal spike patterns.
The model provides a sparse and efficient coding scheme with precisely controllable spike rate that facilitates training of spiking neural networks in various auditory perception tasks.
arXiv Detail & Related papers (2023-09-11T13:06:19Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- ETLP: Event-based Three-factor Local Plasticity for online learning with neuromorphic hardware [105.54048699217668]
We show competitive accuracy and a clear advantage in computational complexity for Event-Based Three-factor Local Plasticity (ETLP).
We also show that when using local plasticity, threshold adaptation in spiking neurons and a recurrent topology are necessary to learn spatio-temporal patterns with a rich temporal structure.
arXiv Detail & Related papers (2023-01-19T19:45:42Z)
- Bayesian Continual Learning via Spiking Neural Networks [38.518936229794214]
We take steps towards the design of neuromorphic systems that are capable of adaptation to changing learning tasks.
We derive online learning rules for spiking neural networks (SNNs) within a Bayesian continual learning framework.
We instantiate the proposed approach for both real-valued and binary synaptic weights.
arXiv Detail & Related papers (2022-08-29T17:11:14Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use spatio-temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Structural plasticity on an accelerated analog neuromorphic hardware system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)