Efficient Online Learning for Networks of Two-Compartment Spiking Neurons
- URL: http://arxiv.org/abs/2402.15969v1
- Date: Sun, 25 Feb 2024 03:15:12 GMT
- Title: Efficient Online Learning for Networks of Two-Compartment Spiking Neurons
- Authors: Yujia Yin, Xinyi Chen, Chenxiang Ma, Jibin Wu, Kay Chen Tan
- Abstract summary: We present a novel online learning method specifically tailored for networks of TC-LIF neurons.
We also propose a refined TC-LIF neuron model called Adaptive TC-LIF, which is carefully designed to enhance temporal information integration.
Our approach successfully preserves the superior sequential modeling capabilities of the TC-LIF neuron while incorporating the training efficiency and hardware friendliness of online learning.
- Score: 23.720523101102593
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The brain-inspired Spiking Neural Networks (SNNs) have garnered considerable
research interest due to their superior performance and energy efficiency in
processing temporal signals. Recently, a novel multi-compartment spiking neuron
model, namely the Two-Compartment LIF (TC-LIF) model, has been proposed and has
exhibited a remarkable capacity for sequential modeling. However, training the
TC-LIF model presents challenges stemming from the large memory consumption and
the vanishing-gradient issue associated with the Backpropagation Through
Time (BPTT) algorithm. To address these challenges, online learning
methodologies emerge as a promising solution. Yet, to date, the application of
online learning methods in SNNs has been predominantly confined to simplified
Leaky Integrate-and-Fire (LIF) neuron models. In this paper, we present a novel
online learning method specifically tailored for networks of TC-LIF neurons.
Additionally, we propose a refined TC-LIF neuron model called Adaptive TC-LIF,
which is carefully designed to enhance temporal information integration in
online learning scenarios. Extensive experiments, conducted on various
sequential benchmarks, demonstrate that our approach successfully preserves the
superior sequential modeling capabilities of the TC-LIF neuron while
incorporating the training efficiency and hardware friendliness of online
learning. As a result, it offers a multitude of opportunities to leverage
neuromorphic solutions for processing temporal signals.
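For readers unfamiliar with the model, the sketch below illustrates the general shape of a two-compartment LIF update in discrete time: a dendritic compartment integrates synaptic input, and a coupled somatic compartment emits spikes. This is a minimal illustration under assumed generic coupling constants (beta_d, beta_s, g_ds, g_sd) and a soft reset; it does not reproduce the paper's exact TC-LIF parameterization or the Adaptive TC-LIF refinements.

```python
import numpy as np

def tc_lif_step(u_d, u_s, x, beta_d=0.9, beta_s=0.9,
                g_ds=0.5, g_sd=0.5, v_th=1.0):
    """One step of a generic two-compartment LIF neuron (illustrative only).

    u_d, u_s : dendritic and somatic membrane potentials (float arrays)
    x        : synaptic input current at this time step
    """
    # Dendritic compartment: leaky integration of input,
    # plus feedback coupling from the soma.
    u_d = beta_d * u_d + g_ds * u_s + x
    # Somatic compartment: leaky integration of dendritic drive.
    u_s = beta_s * u_s + g_sd * u_d
    # Spike generation with a hard threshold.
    spike = (u_s >= v_th).astype(u_s.dtype)
    # Soft reset: subtract the threshold from units that fired.
    u_s = u_s - v_th * spike
    return u_d, u_s, spike

# Example: 100 time steps for 8 neurons driven by random input.
rng = np.random.default_rng(0)
u_d, u_s = np.zeros(8), np.zeros(8)
for t in range(100):
    u_d, u_s, s = tc_lif_step(u_d, u_s, rng.normal(0.0, 0.5, size=8))
```

The two coupled compartments give the neuron a slower, richer internal state than a single-compartment LIF, which is what underlies its stronger long-range sequential modeling.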
Related papers
- Comprehensive Online Training and Deployment for Spiking Neural Networks [40.255762156745405]
Spiking Neural Networks (SNNs) are considered to have enormous potential in the future development of Artificial Intelligence (AI)
Currently proposed online training methods cannot tackle the inseparability of temporally dependent gradients.
We propose the Efficient Multi-Precision Firing (EM-PF) model, a family of advanced spiking models based on floating-point spikes and binary synaptic weights.
arXiv Detail & Related papers (2024-10-10T02:39:22Z)
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Unleashing the Potential of Spiking Neural Networks for Sequential Modeling with Contextual Embedding [32.25788551849627]
Brain-inspired spiking neural networks (SNNs) have struggled to match their biological counterpart in modeling long-term temporal relationships.
This paper presents a novel Contextual Embedding Leaky Integrate-and-Fire (CE-LIF) spiking neuron model.
arXiv Detail & Related papers (2023-08-29T09:33:10Z)
- TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF.
arXiv Detail & Related papers (2023-08-25T08:54:41Z)
- Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning (a schematic sketch of this style of update appears after this list).
arXiv Detail & Related papers (2022-10-09T07:47:56Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z)
- Brain-Inspired Learning on Neuromorphic Substrates [5.279475826661643]
This article provides a mathematical framework for the design of practical online learning algorithms for neuromorphic substrates.
Specifically, we show a direct connection between Real-Time Recurrent Learning (RTRL) and biologically plausible learning rules for training Spiking Neural Networks (SNNs).
We motivate a sparse approximation based on block-diagonal Jacobians, which reduces the algorithm's computational complexity.
arXiv Detail & Related papers (2020-10-22T17:56:59Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
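To make the forward-in-time idea shared by the online-learning papers above (OTTT, and RTRL-style rules with block-diagonal Jacobian approximations) concrete, here is a hedged schematic of a local, trace-based weight update. All names (surrogate_grad, online_update, lam) are illustrative assumptions, not any paper's actual API; the point is that memory stays constant in sequence length because only running traces, not the full unrolled history, are stored.

```python
import numpy as np

def surrogate_grad(u, v_th=1.0, alpha=2.0):
    # Triangular surrogate for the non-differentiable Heaviside spike
    # function (one common choice among several in the SNN literature).
    return np.maximum(0.0, 1.0 - alpha * np.abs(u - v_th))

def online_update(w, trace, x, u, err, lr=1e-3, lam=0.9, v_th=1.0):
    """One forward-in-time learning step for a single spiking layer.

    Schematic only; OTTT's exact derivation and error propagation
    differ in detail.
    """
    # Low-pass presynaptic trace replaces the stored input history.
    trace = lam * trace + x
    # Instantaneous postsynaptic factor: error signal times the
    # surrogate gradient of the current membrane potential.
    g = err * surrogate_grad(u, v_th)
    # Local, outer-product weight update -- no unrolling over time.
    w = w - lr * np.outer(g, trace)
    return w, trace

# Hypothetical usage: one layer with 4 inputs and 3 neurons.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(3, 4))
trace = np.zeros(4)
for t in range(50):
    x = rng.random(4)      # presynaptic activity at this step
    u = w @ x              # membrane drive (simplified)
    err = u - 0.5          # stand-in error signal
    w, trace = online_update(w, trace, x, u, err)
```

By contrast, BPTT must store every intermediate state of the unrolled sequence, which is exactly the memory bottleneck the main paper's online method is designed to avoid.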