Beyond Weights: Deep learning in Spiking Neural Networks with pure
synaptic-delay training
- URL: http://arxiv.org/abs/2306.06237v5
- Date: Tue, 29 Aug 2023 20:22:11 GMT
- Title: Beyond Weights: Deep learning in Spiking Neural Networks with pure
synaptic-delay training
- Authors: Edoardo W. Grappolini and Anand Subramoney
- Abstract summary: We show that training ONLY the delays in feed-forward spiking networks using backpropagation can achieve performance comparable to the more conventional weight training.
We demonstrate the task performance of delay-only training on MNIST and Fashion-MNIST datasets in preliminary experiments.
- Score: 0.9208007322096533
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Biological evidence suggests that adaptation of synaptic delays on short to
medium timescales plays an important role in learning in the brain. Inspired by
biology, we explore the feasibility and power of using synaptic delays to solve
challenging tasks even when the synaptic weights are not trained but kept at
randomly chosen fixed values. We show that training ONLY the delays in
feed-forward spiking networks using backpropagation can achieve performance
comparable to the more conventional weight training. Moreover, further
constraining the weights to ternary values does not significantly affect the
networks' ability to solve the tasks using only the synaptic delays. We
demonstrate the task performance of delay-only training on MNIST and
Fashion-MNIST datasets in preliminary experiments. This demonstrates a new
paradigm for training spiking neural networks and sets the stage for models
that can be more efficient than the ones that use weights for computation.
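As a concrete, hypothetical illustration of the delay-only idea: the sketch below is a minimal PyTorch layer, not the authors' implementation. The layer sizes, the linear-interpolation delay scheme, and the max_delay and ternary parameters are assumptions made for the example. The weights are frozen at random (optionally ternary) values, and backpropagation reaches only the per-synapse delays.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DelayLayer(nn.Module):
    """Feed-forward layer whose only trainable parameters are per-synapse
    transmission delays; the weights stay frozen at random values."""
    def __init__(self, n_in, n_out, max_delay=25, ternary=False):
        super().__init__()
        self.max_delay = max_delay
        w = torch.randn(n_out, n_in)
        if ternary:
            # Optional ternary constraint from the abstract: weights in {-1, 0, +1}.
            w = torch.sign(w) * (w.abs() > 0.5).float()
        self.register_buffer("weight", w)  # a buffer, so it never receives gradients
        # Learnable fractional delays, in simulation time steps.
        self.delay = nn.Parameter(torch.rand(n_out, n_in) * max_delay)

    def forward(self, spikes):  # spikes: (batch, T, n_in) binary spike trains
        B, T, _ = spikes.shape
        d = self.delay.clamp(0.0, self.max_delay - 1.0)
        lo = d.floor().long()  # integer part of each delay
        frac = d - d.floor()   # fractional part; this is what carries gradients
        # Left-pad time so each synapse reads zeros before its input arrives.
        s = F.pad(spikes.permute(0, 2, 1), (self.max_delay, 0))  # (B, n_in, T+D)
        t = torch.arange(T, device=spikes.device)
        idx = t + self.max_delay - lo[..., None]  # (n_out, n_in, T), all >= 1
        s = s[:, None].expand(B, idx.shape[0], idx.shape[1], s.shape[-1])
        s_lo = s.gather(3, idx[None].expand(B, -1, -1, -1))
        s_hi = s.gather(3, (idx - 1)[None].expand(B, -1, -1, -1))
        # Linear interpolation between adjacent time steps makes the shift
        # differentiable with respect to the delay value.
        shifted = (1 - frac)[None, ..., None] * s_lo + frac[None, ..., None] * s_hi
        # Fixed-weight sum over inputs -> synaptic current per output neuron.
        # (Dense for clarity; a real implementation would avoid materializing
        # the full (B, n_out, n_in, T) tensor.)
        return (self.weight[None, ..., None] * shifted).sum(2).permute(0, 2, 1)

In a full network this current would drive a spiking nonlinearity (e.g. a leaky integrate-and-fire layer trained with surrogate gradients), and the optimizer would be handed only the delay parameters, so the random weights never change.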
Related papers
- DelGrad: Exact gradients in spiking networks for learning transmission delays and weights [0.9411751957919126]
Spiking neural networks (SNNs) inherently rely on the timing of signals for representing and processing information.
Recent work has demonstrated the substantial advantages of learning these delays along with synaptic weights.
We propose an analytical approach for calculating exact loss gradients with respect to both synaptic weights and delays in an event-based fashion (a toy illustration of such delay gradients follows this list).
arXiv Detail & Related papers (2024-04-30T00:02:34Z)
- ELiSe: Efficient Learning of Sequences in Structured Recurrent Networks [1.5931140598271163]
We build a model for efficient learning of sequences using only local, always-on, and phase-free plasticity.
We showcase the capabilities of ELiSe in a mock-up of birdsong learning, and demonstrate its flexibility with respect to parametrization.
arXiv Detail & Related papers (2024-02-26T17:30:34Z)
- NeuralFastLAS: Fast Logic-Based Learning from Raw Data [54.938128496934695]
Symbolic rule learners generate interpretable solutions; however, they require the input to be encoded symbolically.
Neuro-symbolic approaches overcome this issue by mapping raw data to latent symbolic concepts using a neural network.
We introduce NeuralFastLAS, a scalable and fast end-to-end approach that trains a neural network jointly with a symbolic learner.
arXiv Detail & Related papers (2023-10-08T12:33:42Z)
- Neuromorphic Online Learning for Spatiotemporal Patterns with a Forward-only Timeline [5.094970748243019]
Spiking neural networks (SNNs) are bio-plausible computing models with high energy efficiency.
Backpropagation Through Time (BPTT) is traditionally used to train SNNs.
We present Spatiotemporal Online Learning for Synaptic Adaptation (SOLSA), specifically designed for online learning of SNNs.
arXiv Detail & Related papers (2023-07-21T02:47:03Z)
- Globally Optimal Training of Neural Networks with Threshold Activation Functions [63.03759813952481]
We study weight decay regularized training problems of deep neural networks with threshold activations.
We derive a simplified convex optimization formulation when the dataset can be shattered at a certain layer of the network.
arXiv Detail & Related papers (2023-03-06T18:59:13Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends a recently proposed implicit-differentiation training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z)
- Continual learning benefits from multiple sleep mechanisms: NREM, REM, and Synaptic Downscaling [51.316408685035526]
Learning new tasks and skills in succession without losing prior learning is a computational challenge for both artificial and biological neural networks.
Here, we investigate how modeling three distinct components of mammalian sleep together affects continual learning in artificial neural networks.
arXiv Detail & Related papers (2022-09-09T13:45:27Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- SpikePropamine: Differentiable Plasticity in Spiking Neural Networks [0.0]
We introduce a framework for learning the dynamics of synaptic plasticity and neuromodulated synaptic plasticity in Spiking Neural Networks (SNNs).
We show that SNNs augmented with differentiable plasticity are sufficient for solving a set of challenging temporal learning tasks.
These networks are also shown to be capable of producing locomotion on a high-dimensional robotic learning task.
arXiv Detail & Related papers (2021-06-04T19:29:07Z)
- Bio-plausible Unsupervised Delay Learning for Extracting Temporal Features in Spiking Neural Networks [0.548253258922555]
The plasticity of the conduction delay between neurons plays a fundamental role in learning.
Understanding the precise adjustment of synaptic delays could help us in developing effective brain-inspired computational models.
arXiv Detail & Related papers (2020-11-18T16:25:32Z)
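Returning to the DelGrad entry above, here is the promised toy illustration. It is a hypothetical stand-in, not the DelGrad implementation: the closed form used for the output spike time (a weighted mean of arrival times) is invented to keep the example self-contained, whereas real event-based models derive the true expression from the neuron dynamics. The point it shows is that with spike-time coding, an output spike time is a function of the delayed arrival times t_i + d_i, so exact delay gradients follow from the chain rule dL/dd_i = (dL/dt_out) * (dt_out/d(t_i + d_i)).

import torch

t_in = torch.tensor([1.0, 2.0, 3.0])                    # input spike times
d = torch.tensor([0.5, 0.5, 0.5], requires_grad=True)   # trainable synaptic delays
w = torch.tensor([0.4, 0.3, 0.3])                       # fixed synaptic weights

# Stand-in closed form for the output spike time: a weighted mean of the
# delayed arrival times. (Invented for this example; an event-based model
# would supply its own analytical expression.)
t_out = (w * (t_in + d)).sum() / w.sum()

loss = (t_out - 2.0) ** 2   # push the output spike toward t = 2
loss.backward()
print(d.grad)               # exact per-synapse delay gradients, no unrolling in time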