Fluctuation-driven initialization for spiking neural network training
- URL: http://arxiv.org/abs/2206.10226v1
- Date: Tue, 21 Jun 2022 09:48:49 GMT
- Title: Fluctuation-driven initialization for spiking neural network training
- Authors: Julian Rossbroich, Julia Gygax, and Friedemann Zenke
- Abstract summary: Spiking neural networks (SNNs) underlie low-power, fault-tolerant information processing in the brain.
We develop a general strategy for SNNs inspired by the fluctuation-driven regime commonly observed in the brain.
- Score: 3.976291254896486
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNNs) underlie low-power, fault-tolerant information
processing in the brain and could constitute a power-efficient alternative to
conventional deep neural networks when implemented on suitable neuromorphic
hardware accelerators. However, instantiating SNNs that solve complex
computational tasks in silico remains a significant challenge. Surrogate
gradient (SG) techniques have emerged as a standard solution for training SNNs
end-to-end. Still, their success depends on synaptic weight initialization,
similar to conventional artificial neural networks (ANNs). Yet, unlike in the
case of ANNs, it remains elusive what constitutes a good initial state for an
SNN. Here, we develop a general initialization strategy for SNNs inspired by
the fluctuation-driven regime commonly observed in the brain. Specifically, we
derive practical solutions for data-dependent weight initialization that ensure
fluctuation-driven firing in the widely used leaky integrate-and-fire (LIF)
neurons. We empirically show that SNNs initialized following our strategy
exhibit superior learning performance when trained with SGs. These findings
generalize across several datasets and SNN architectures, including fully
connected, deep convolutional, recurrent, and more biologically plausible SNNs
obeying Dale's law. Thus, fluctuation-driven initialization provides a
practical, versatile, and easy-to-implement strategy for improving SNN training
performance on diverse tasks in neuromorphic engineering and computational
neuroscience.
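To make the idea concrete, the sketch below illustrates one way such a data-dependent initialization can look in practice: weights are drawn zero-mean, with a variance chosen via a shot-noise (Campbell's theorem) approximation so that membrane-potential fluctuations reach a target fraction of the firing threshold. The function name, the double-exponential kernel, and the default constants are illustrative assumptions, not the authors' exact derivation; see the paper for the precise formulas.

```python
import numpy as np

def fluctuation_driven_init(spikes, n_out, theta=1.0, xi=2.0,
                            tau_mem=20e-3, tau_syn=5e-3, dt=1e-3, rng=None):
    """Hypothetical sketch of data-dependent, fluctuation-driven weight
    initialization for one fully connected LIF layer.

    spikes : (batch, time, n_in) binary input spike array (the data)
    n_out  : number of postsynaptic LIF neurons
    theta  : firing threshold, measured from the resting potential (0)
    xi     : target ratio theta / sigma_U; fluctuations are set to
             sigma_U = theta / xi so that firing is fluctuation-driven
    """
    if rng is None:
        rng = np.random.default_rng()
    n_in = spikes.shape[-1]
    # Data-dependent statistic: mean firing rate of each input (Hz).
    nu = spikes.mean(axis=(0, 1)) / dt
    # Integral of the squared PSP kernel for an unnormalized
    # double exponential, eps(t) = exp(-t/tau_mem) - exp(-t/tau_syn).
    eps2 = 0.5 * (tau_mem + tau_syn) \
        - 2.0 * tau_mem * tau_syn / (tau_mem + tau_syn)
    # Campbell's theorem: Var[U] ~= sigma_w^2 * sum_j nu_j * eps2.
    # Solve for sigma_w so that std[U] = theta / xi.
    sigma_w = (theta / xi) / np.sqrt(max(nu.sum() * eps2, 1e-12))
    return rng.normal(0.0, sigma_w, size=(n_in, n_out))
```

With, say, 100 inputs firing at roughly 10 Hz, this yields small zero-mean weights that keep the mean membrane potential near rest while coincident inputs occasionally push it across threshold, which is the fluctuation-driven regime the abstract refers to.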
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN) into a scalable variant (S-MNN), we reduce the computational time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Deep activity propagation via weight initialization in spiking neural networks [10.69085409825724]
Spiking Neural Networks (SNNs) offer bio-inspired advantages such as sparsity and ultra-low power consumption.
Deep SNNs process and transmit information by quantizing the real-valued membrane potentials into binary spikes.
We show theoretically that, unlike standard approaches, the proposed weight initialization enables activity to propagate in deep SNNs without loss of spikes.
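As a brief illustration of the quantization step this entry describes, here is a minimal discrete-time LIF update in which the real-valued membrane potential is binarized into spikes; it is a generic textbook form with illustrative parameters, not that paper's initialization scheme.

```python
import numpy as np

def lif_step(u, x, w, beta=0.9, theta=1.0):
    """One discrete-time LIF update: leak, integrate the synaptic input,
    quantize the real-valued membrane potential into binary spikes, and
    reset by subtraction. Generic sketch; parameters are illustrative."""
    u = beta * u + x @ w                # leaky integration of input current
    s = (u >= theta).astype(u.dtype)    # binarize: spike where u >= theta
    u = u - s * theta                   # soft reset: subtract the threshold
    return u, s
```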
arXiv Detail & Related papers (2024-10-01T11:02:34Z)
- Training Spiking Neural Networks via Augmented Direct Feedback Alignment [3.798885293742468]
Spiking neural networks (SNNs) are promising solutions for implementing neural networks in neuromorphic devices.
However, the non-differentiable nature of spiking neurons makes SNNs challenging to train.
In this paper, we propose using augmented direct feedback alignment (aDFA), a gradient-free approach based on random projection, to train SNNs.
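To sketch the random-projection idea behind DFA, here is plain direct feedback alignment with a rate-based tanh layer standing in for the spiking dynamics; the augmented, gradient-free variant of the paper is not reproduced here, and all sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 100, 64, 10
W1 = rng.normal(0.0, 0.1, (n_in, n_hid))
W2 = rng.normal(0.0, 0.1, (n_hid, n_out))
B = rng.normal(0.0, 0.1, (n_out, n_hid))  # fixed random feedback matrix
x, target, lr = rng.random(n_in), np.eye(n_out)[3], 1e-2

h = np.tanh(x @ W1)   # hidden activity (rate-based stand-in for spiking)
y = h @ W2            # linear readout
e = y - target        # output error
# DFA: instead of backpropagating e through W2.T as in backprop, project
# it through the fixed random matrix B to get the hidden teaching signal.
W1 -= lr * np.outer(x, (e @ B) * (1.0 - h ** 2))
W2 -= lr * np.outer(h, e)
```

Feedback alignment works because the forward weights adapt until the fixed random feedback conveys a useful error direction, so no weight transport is needed.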
arXiv Detail & Related papers (2024-09-12T06:22:44Z)
- Stochastic Spiking Neural Networks with First-to-Spike Coding [7.955633422160267]
Spiking Neural Networks (SNNs) are known for their bio-plausibility and energy efficiency.
In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures.
We investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, and energy consumption across several datasets.
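For readers unfamiliar with first-to-spike coding, a minimal decoder looks like the following; this is a common convention, not that paper's implementation: the predicted class is the output neuron that fires earliest.

```python
import numpy as np

def first_to_spike_decode(spikes):
    """Decode a class from output spike trains under first-to-spike coding.
    spikes: (time, n_classes) binary array; the winner is the neuron that
    fires first. Neurons that never fire are assigned the latest time."""
    t_first = np.where(spikes.any(axis=0),
                       spikes.argmax(axis=0),  # first time step with a spike
                       spikes.shape[0])        # silent neuron -> time T
    return int(t_first.argmin())
```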
arXiv Detail & Related papers (2024-04-26T22:52:23Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
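As a taste of the toolkit, here is a minimal usage sketch, assuming a recent SpikingJelly release that provides the `activation_based` namespace; the layer sizes and number of time steps are arbitrary.

```python
import torch
from spikingjelly.activation_based import neuron, surrogate, functional

# A tiny two-layer SNN with surrogate-gradient LIF neurons.
net = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 100),
    neuron.LIFNode(tau=2.0, surrogate_function=surrogate.ATan()),
    torch.nn.Linear(100, 10),
    neuron.LIFNode(tau=2.0, surrogate_function=surrogate.ATan()),
)

x = torch.rand(8, 1, 28, 28)                   # dummy input batch
rate = sum(net(x) for _ in range(20)) / 20.0   # run T = 20 time steps
functional.reset_net(net)                      # reset states between samples
```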
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
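The core of TTFS (time-to-first-spike) coding is a monotone map from activation value to spike time, for instance the following common convention; this is not necessarily the exact LC-TTFS mapping.

```python
import numpy as np

def ttfs_encode(a, t_max=100):
    """Map normalized activations a in [0, 1] to first-spike times:
    a stronger activation fires earlier, so each neuron emits at most
    one spike per inference window of t_max time steps."""
    return np.round(t_max * (1.0 - np.clip(a, 0.0, 1.0))).astype(int)
```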
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding scheme for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z)
- Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs is on par with that of the current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z)
- Long Short-Term Memory Spiking Networks and Their Applications [10.071615423169902]
We present a novel framework for training recurrent spiking neural networks (SNNs).
We show that LSTM spiking networks learn the timing of the spikes and temporal dependencies.
We also develop a methodology for error backpropagation within LSTM-based SNNs.
arXiv Detail & Related papers (2020-07-09T13:22:27Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)