Event-Driven Learning for Spiking Neural Networks
- URL: http://arxiv.org/abs/2403.00270v1
- Date: Fri, 1 Mar 2024 04:17:59 GMT
- Title: Event-Driven Learning for Spiking Neural Networks
- Authors: Wenjie Wei, Malu Zhang, Jilin Zhang, Ammar Belatreche, Jibin Wu,
Zijing Xu, Xuerui Qiu, Hong Chen, Yang Yang, Haizhou Li
- Abstract summary: Brain-inspired spiking neural networks (SNNs) have gained prominence in the field of neuromorphic computing.
It remains an open challenge how to effectively benefit from the sparse event-driven property of SNNs to minimize backpropagation learning costs.
We introduce two novel event-driven learning methods: the spike-timing-dependent event-driven (STD-ED) and membrane-potential-dependent event-driven (MPD-ED) algorithms.
- Score: 43.17286932151372
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Brain-inspired spiking neural networks (SNNs) have gained prominence in the
field of neuromorphic computing owing to their low energy consumption during
feedforward inference on neuromorphic hardware. However, it remains an open
challenge how to effectively benefit from the sparse event-driven property of
SNNs to minimize backpropagation learning costs. In this paper, we conduct a
comprehensive examination of the existing event-driven learning algorithms,
reveal their limitations, and propose novel solutions to overcome them.
Specifically, we introduce two novel event-driven learning methods: the
spike-timing-dependent event-driven (STD-ED) and membrane-potential-dependent
event-driven (MPD-ED) algorithms. These proposed algorithms leverage precise
neuronal spike timing and membrane potential, respectively, for effective
learning. The two methods are extensively evaluated on static and neuromorphic
datasets to confirm their superior performance. They outperform existing
event-driven counterparts by up to 2.51% for STD-ED and 6.79% for MPD-ED on the
CIFAR-100 dataset. In addition, we theoretically and experimentally validate
the energy efficiency of our methods on neuromorphic hardware. On-chip learning
experiments achieved a remarkable 30-fold reduction in energy consumption over
time-step-based surrogate gradient methods. The demonstrated efficiency and
efficacy of the proposed event-driven learning methods emphasize their
potential to significantly advance the field of neuromorphic computing,
offering promising avenues for energy-efficient applications.
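
The paper itself provides no code, but the contrast it draws with time-step-based surrogate-gradient training can be illustrated with a minimal sketch: a leaky integrate-and-fire (LIF) neuron whose (toy) weight update is triggered only at spike events, so the number of learning updates scales with the spike count rather than with the number of simulation steps. The constants, variable names, and update rule below are illustrative assumptions, not the STD-ED or MPD-ED algorithms.

```python
# Conceptual sketch (not the paper's STD-ED/MPD-ED): an LIF neuron where a toy
# weight update runs only at spike events, illustrating why event-driven
# learning touches far fewer time steps than per-step surrogate-gradient
# training. All names and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

T, n_in = 100, 8            # simulation steps, number of input synapses
tau, v_th = 20.0, 1.0       # membrane time constant, firing threshold
lr = 0.01                   # illustrative learning rate

w = rng.normal(0.0, 0.3, n_in)        # synaptic weights
x = rng.random((T, n_in)) < 0.1       # sparse binary input spike trains
target_rate = 0.05                    # desired output firing rate (toy objective)

v = 0.0
spike_times, updates = [], 0
for t in range(T):
    v = v * (1.0 - 1.0 / tau) + w @ x[t]   # leaky integration of input current
    if v >= v_th:                          # spike event
        spike_times.append(t)
        v = 0.0                            # hard reset
        # Event-driven update: executed only when a spike occurs.
        # Toy rule: nudge weights toward the target firing rate using the
        # presynaptic activity at the spike time (stand-in for a real gradient).
        err = len(spike_times) / (t + 1) - target_rate
        w -= lr * err * x[t]
        updates += 1

print(f"spikes: {len(spike_times)}, event-driven updates: {updates}, "
      f"per-step updates a surrogate-gradient method would perform: {T}")
```

Because updates occur only at (sparse) spike events, the learning cost tracks network activity instead of simulation length, which is the intuition behind the reported on-chip energy savings.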
Related papers
- Growing Deep Neural Network Considering with Similarity between Neurons [4.32776344138537]
We explore a novel approach that progressively increases the number of neurons in compact models during training.
We propose a method that reduces feature extraction biases and neuronal redundancy by introducing constraints based on neuron similarity distributions.
Results on the CIFAR-10 and CIFAR-100 datasets demonstrate accuracy improvements.
arXiv Detail & Related papers (2024-08-23T11:16:37Z) - Physical formula enhanced multi-task learning for pharmacokinetics prediction [54.13787789006417]
A major challenge for AI-driven drug discovery is the scarcity of high-quality data.
We develop a physical formula enhanced multi-task learning (PEMAL) method that predicts four key pharmacokinetic parameters simultaneously.
Our experiments reveal that PEMAL significantly lowers the data demand, compared to typical Graph Neural Networks.
arXiv Detail & Related papers (2024-04-16T07:42:55Z) - A Neuromorphic Approach to Obstacle Avoidance in Robot Manipulation [16.696524554516294]
We develop a neuromorphic approach to obstacle avoidance on a camera-equipped manipulator.
Our approach adapts high-level trajectory plans with reactive maneuvers by processing emulated event data in a convolutional SNN.
Our results motivate incorporating SNN learning, utilizing neuromorphic processors, and further exploring the potential of neuromorphic methods.
arXiv Detail & Related papers (2024-04-08T20:42:10Z) - Accelerating Neural Network Training: A Brief Review [0.5825410941577593]
This study examines innovative approaches to expedite the training process of deep neural networks (DNNs).
The research utilizes sophisticated methodologies, including Gradient Accumulation (GA), Automatic Mixed Precision (AMP), and Pin Memory (PM).
arXiv Detail & Related papers (2023-12-15T18:43:45Z) - Evaluating Spiking Neural Network On Neuromorphic Platform For Human
Activity Recognition [2.710807780228189]
Energy efficiency and low latency are crucial requirements for wearable AI-empowered human activity recognition systems.
A spike-based workout recognition system can achieve accuracy comparable to that of a traditional neural network running on the popular milliwatt-scale RISC-V-based multi-core processor GAP8.
arXiv Detail & Related papers (2023-08-01T18:59:06Z) - SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural
Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z) - SpikeDyn: A Framework for Energy-Efficient Spiking Neural Networks with
Continual and Unsupervised Learning Capabilities in Dynamic Environments [14.727296040550392]
Spiking Neural Networks (SNNs) bear the potential of efficient unsupervised and continual learning capabilities because of their biological plausibility.
We propose SpikeDyn, a framework for energy-efficient SNNs with continual and unsupervised learning capabilities in dynamic environments.
arXiv Detail & Related papers (2021-02-28T08:26:23Z) - And/or trade-off in artificial neurons: impact on adversarial robustness [91.3755431537592]
The presence of a sufficient number of OR-like neurons in a network can lead to classification brittleness and increased vulnerability to adversarial attacks.
We define AND-like neurons and propose measures to increase their proportion in the network.
Experimental results on the MNIST dataset suggest that our approach holds promise as a direction for further exploration.
arXiv Detail & Related papers (2021-02-15T08:19:05Z) - Towards Efficient Processing and Learning with Spikes: New Approaches
for Multi-Spike Learning [59.249322621035056]
We propose two new multi-spike learning rules which demonstrate better performance over other baselines on various tasks.
In the feature detection task, we re-examine the ability of unsupervised STDP and present its limitations.
Our proposed learning rules can reliably solve the task over a wide range of conditions without specific constraints being applied.
arXiv Detail & Related papers (2020-05-02T06:41:20Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)