MAP-SNN: Mapping Spike Activities with Multiplicity, Adaptability, and
Plasticity into Bio-Plausible Spiking Neural Networks
- URL: http://arxiv.org/abs/2204.09893v1
- Date: Thu, 21 Apr 2022 05:36:11 GMT
- Title: MAP-SNN: Mapping Spike Activities with Multiplicity, Adaptability, and
Plasticity into Bio-Plausible Spiking Neural Networks
- Authors: Chengting Yu, Yangkai Du, Mufeng Chen, Aili Wang, Gaoang Wang and
Erping Li
- Abstract summary: Spiking Neural Networks (SNNs) are considered more biologically realistic and power-efficient, as they imitate the fundamental mechanisms of the human brain.
We consider three properties in modeling spike activities: Multiplicity, Adaptability, and Plasticity (MAP).
The proposed SNN model achieves competitive performance on the neuromorphic datasets N-MNIST and SHD.
- Score: 4.806663076114504
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) are considered more biologically realistic and
power-efficient, as they imitate the fundamental mechanisms of the human brain.
Recently, backpropagation (BP) based SNN learning algorithms that utilize deep
learning frameworks have achieved good performance. However,
bio-interpretability is partially neglected in those BP-based algorithms.
Toward bio-plausible BP-based SNNs, we consider three properties in modeling
spike activities: Multiplicity, Adaptability, and Plasticity (MAP). In terms of
multiplicity, we propose a Multiple-Spike Pattern (MSP) with multiple spike
transmission to strengthen model robustness in discrete time-iteration. To
realize adaptability, we adopt Spike Frequency Adaption (SFA) under MSP to
decrease spike activities for improved efficiency. For plasticity, we propose a
trainable convolutional synapse that models spike response current to enhance
the diversity of spiking neurons for temporal feature extraction. The proposed
SNN model achieves competitive performance on the neuromorphic datasets N-MNIST
and SHD. Furthermore, experimental results demonstrate that the three proposed
aspects are essential to the iterative robustness, spike efficiency, and temporal
feature extraction capability of spike activities. In summary, this work
proposes a feasible scheme for bio-inspired spike activities with MAP, offering
a new neuromorphic perspective to embed biological characteristics into spiking
neural networks.
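The three MAP properties can be illustrated in a single discrete-time neuron update. The following sketch is illustrative only, not the paper's implementation: the function name, parameter values, and soft-reset rule are assumptions. It combines a causal convolution with a (notionally trainable) synaptic kernel for plasticity, an adaptive firing threshold for spike-frequency adaptation, and the emission of more than one spike per time step for the multiple-spike pattern.

```python
import numpy as np

def simulate_lif_sfa(spikes_in, kernel, v_th=1.0, tau_m=0.9, tau_a=0.95,
                     beta=0.2, max_spikes=3):
    """Toy discrete-time LIF neuron sketching the three MAP properties.

    spikes_in : (T,) array of input spike counts per time step
    kernel    : (K,) synaptic response kernel (trainable in the paper's setting)
    Returns output spike counts per step; values above 1 illustrate a
    multiple-spike pattern within one simulation step.
    """
    # Plasticity: synaptic current as a causal convolution of the input
    # spike train with a learnable spike-response kernel.
    current = np.convolve(spikes_in, kernel)[: len(spikes_in)]

    v, a = 0.0, 0.0                       # membrane potential, adaptation trace
    out = np.zeros(len(spikes_in), dtype=int)
    for t, i_t in enumerate(current):
        v = tau_m * v + i_t               # leaky integration
        # Adaptability (SFA): threshold rises with recent firing activity.
        theta = v_th + beta * a
        # Multiplicity (MSP): emit several spikes if v far exceeds threshold.
        n = min(int(v // theta), max_spikes) if v >= theta else 0
        v -= n * theta                    # soft reset per emitted spike
        a = tau_a * a + n                 # decay adaptation, add new spikes
        out[t] = n
    return out
```

Driving the neuron with a constant input shows the adaptation at work: the firing rate drops below the input rate as the threshold climbs, which is the efficiency effect the abstract attributes to SFA.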
Related papers
- SPikE-SSM: A Sparse, Precise, and Efficient Spiking State Space Model for Long Sequences Learning [21.37966285950504]
Spiking neural networks (SNNs) provide an energy-efficient solution by utilizing the spike-based and sparse nature of biological systems.
The recent emergence of state space models (SSMs) offers superior computational efficiency and modeling capability.
We propose a sparse, precise and efficient spiking SSM framework, termed SPikE-SSM.
arXiv Detail & Related papers (2024-10-07T12:20:38Z)
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Exploiting Heterogeneity in Timescales for Sparse Recurrent Spiking Neural Networks for Energy-Efficient Edge Computing [16.60622265961373]
Spiking Neural Networks (SNNs) represent the forefront of neuromorphic computing.
This paper weaves together three groundbreaking studies that revolutionize SNN performance.
arXiv Detail & Related papers (2024-07-08T23:33:12Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Exploiting High Performance Spiking Neural Networks with Efficient Spiking Patterns [4.8416725611508244]
Spiking Neural Networks (SNNs) use discrete spike sequences to transmit information, which significantly mimics the information transmission of the brain.
This paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron that can make a trade-off between short-time performance and dynamic temporal performance.
arXiv Detail & Related papers (2023-01-29T04:22:07Z)
- Developmental Plasticity-inspired Adaptive Pruning for Deep Spiking and Artificial Neural Networks [11.730984231143108]
Developmental plasticity plays a prominent role in shaping the brain's structure during ongoing learning.
Existing network compression methods for deep artificial neural networks (ANNs) and spiking neural networks (SNNs) draw little inspiration from the brain's developmental plasticity mechanisms.
This paper proposes a developmental plasticity-inspired adaptive pruning (DPAP) method, with inspiration from the adaptive developmental pruning of dendritic spines, synapses, and neurons.
arXiv Detail & Related papers (2022-11-23T05:26:51Z)
- STSC-SNN: Spatio-Temporal Synaptic Connection with Temporal Convolution and Attention for Spiking Neural Networks [7.422913384086416]
Spiking Neural Networks (SNNs), as one of the algorithmic models in neuromorphic computing, have gained a great deal of research attention owing to temporal processing capability.
Existing synaptic structures in SNNs are mostly full connections or spatial 2D convolutions, neither of which can adequately extract temporal dependencies.
We take inspiration from biological synapses and propose a synaptic connection SNN model, to enhance the synapse-temporal receptive fields of synaptic connections.
We show that endowing synaptic models with temporal dependencies can improve the performance of SNNs on classification tasks.
arXiv Detail & Related papers (2022-10-11T08:13:22Z)
- Spikformer: When Spiking Neural Network Meets Transformer [102.91330530210037]
We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-attention mechanism.
We propose a novel Spiking Self Attention (SSA) mechanism as well as a powerful framework, named Spiking Transformer (Spikformer).
arXiv Detail & Related papers (2022-09-29T14:16:49Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.