SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence
- URL: http://arxiv.org/abs/2310.16620v1
- Date: Wed, 25 Oct 2023 13:15:17 GMT
- Title: SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence
- Authors: Wei Fang, Yanqi Chen, Jianhao Ding, Zhaofei Yu, Timothée Masquelier,
Ding Chen, Liwei Huang, Huihui Zhou, Guoqi Li, Yonghong Tian
- Abstract summary: Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
- Score: 51.6943465041708
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on
neuromorphic chips with high energy efficiency by introducing neural dynamics
and spike properties. As the emerging spiking deep learning paradigm attracts
increasing interest, traditional programming frameworks cannot meet the demands
of automatic differentiation, parallel computation acceleration, and the tight
integration of neuromorphic dataset processing and deployment. In this work,
we present the SpikingJelly framework to address the aforementioned dilemma. We
contribute a full-stack toolkit for pre-processing neuromorphic datasets,
building deep SNNs, optimizing their parameters, and deploying SNNs on
neuromorphic chips. Compared to existing methods, the training of deep SNNs can
be accelerated $11\times$, and the superior extensibility and flexibility of
SpikingJelly enable users to accelerate custom models at low cost through
multilevel inheritance and semiautomatic code generation. SpikingJelly paves
the way for synthesizing truly energy-efficient SNN-based machine intelligence
systems, which will enrich the ecology of neuromorphic computing.
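As a concrete illustration of the full-stack workflow described above, the sketch below builds and runs a small deep SNN with SpikingJelly's activation-based API. This is a minimal sketch, not code from the paper: the module paths follow recent SpikingJelly releases (older releases expose similar layers under spikingjelly.clock_driven), and the network, input shapes, and hyperparameters are illustrative placeholders.

import torch
import torch.nn as nn
from spikingjelly.activation_based import neuron, layer, surrogate, functional

# A small fully connected SNN: LIF spiking layers with surrogate-gradient activations.
net = nn.Sequential(
    layer.Flatten(),
    layer.Linear(28 * 28, 128, bias=False),
    neuron.LIFNode(tau=2.0, surrogate_function=surrogate.ATan()),
    layer.Linear(128, 10, bias=False),
    neuron.LIFNode(tau=2.0, surrogate_function=surrogate.ATan()),
)

# Multi-step mode processes the whole sequence of T time steps in one forward call,
# which is where SpikingJelly's parallel acceleration applies.
functional.set_step_mode(net, step_mode='m')

T, N = 4, 8                       # time steps and batch size (illustrative)
x = torch.rand(T, N, 1, 28, 28)   # stand-in for frame-encoded or event-based input
out = net(x)                      # output spikes, shape [T, N, 10]
logits = out.mean(0)              # firing rates over time serve as class scores
functional.reset_net(net)         # clear membrane states before the next sample

In single-step mode (the default) the same network would instead be called once per time step in an explicit Python loop; the multi-step mode shown here is one of the mechanisms behind the training acceleration the abstract refers to.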
Related papers
- Towards Efficient Deployment of Hybrid SNNs on Neuromorphic and Edge AI Hardware [0.493599216374976]
This paper explores the synergistic potential of neuromorphic and edge computing to create a versatile machine learning (ML) system tailored for processing data captured by dynamic vision sensors.
We construct and train hybrid models, blending spiking neural networks (SNNs) and artificial neural networks (ANNs) using PyTorch and Lava frameworks.
arXiv Detail & Related papers (2024-07-11T17:40:39Z)
- Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods [33.377770671553336]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs).
In this paper, we provide a new perspective to summarize the theories and methods for training deep SNNs with high performance.
arXiv Detail & Related papers (2024-05-06T09:58:54Z)
- Spyx: A Library for Just-In-Time Compiled Optimization of Spiking Neural Networks [0.08965418284317034]
Spiking Neural Networks (SNNs) promise improved energy efficiency through a reduced, low-power hardware footprint.
This paper introduces Spyx, a new and lightweight SNN simulation and optimization library designed in JAX.
arXiv Detail & Related papers (2024-02-29T09:46:44Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Deep Reinforcement Learning with Spiking Q-learning [51.386945803485084]
Spiking neural networks (SNNs) are expected to realize artificial intelligence (AI) with less energy consumption.
Combining SNNs with deep reinforcement learning (RL) provides a promising energy-efficient approach to realistic control tasks.
arXiv Detail & Related papers (2022-01-21T16:42:11Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- In-Hardware Learning of Multilayer Spiking Neural Networks on a Neuromorphic Processor [6.816315761266531]
This work presents a spike-based backpropagation algorithm with biologically plausible local update rules and adapts it to fit the constraints of neuromorphic hardware.
The algorithm is implemented on Intel's Loihi chip, enabling low-power in-hardware supervised online learning of multilayer SNNs for mobile applications.
arXiv Detail & Related papers (2021-05-08T09:22:21Z)
- Neuromorphic Processing and Sensing: Evolutionary Progression of AI to Spiking [0.0]
Spiking Neural Network algorithms hold the promise to implement advanced artificial intelligence using a fraction of the computations and power requirements.
This paper explains the theoretical workings of spike-based neuromorphic technologies and surveys the state of the art in hardware processors, software platforms and neuromorphic sensing devices.
A progression path is laid out for current machine learning specialists to update their skill sets, and to migrate classification or predictive models from the current generation of deep neural networks to SNNs.
arXiv Detail & Related papers (2020-07-10T20:54:42Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems (a minimal surrogate-gradient training sketch follows this list).
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
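As referenced in the last entry above, the following is a minimal sketch of direct SNN training with surrogate gradients, written with SpikingJelly for consistency with the main paper. The toy network, random inputs and labels, and hyperparameters are placeholder assumptions, not taken from any of the listed papers.

import torch
import torch.nn as nn
import torch.nn.functional as F
from spikingjelly.activation_based import neuron, layer, surrogate, functional

# Toy spiking classifier trained directly with backpropagation through time;
# the non-differentiable spike is replaced by an ATan surrogate in the backward pass.
net = nn.Sequential(
    layer.Linear(100, 10, bias=False),
    neuron.LIFNode(tau=2.0, surrogate_function=surrogate.ATan()),
)
functional.set_step_mode(net, step_mode='m')
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

T, N = 8, 16                              # time steps and batch size (illustrative)
for iteration in range(100):
    x = torch.rand(T, N, 100)             # placeholder random input currents
    target = torch.randint(0, 10, (N,))   # placeholder class labels
    rate = net(x).mean(0)                 # rate-coded output: mean firing rate per class
    loss = F.cross_entropy(rate, target)

    optimizer.zero_grad()
    loss.backward()                       # gradients flow through the surrogate function
    optimizer.step()
    functional.reset_net(net)             # reset membrane potentials between batches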