An Unsupervised STDP-based Spiking Neural Network Inspired By
Biologically Plausible Learning Rules and Connections
- URL: http://arxiv.org/abs/2207.02727v2
- Date: Sat, 22 Apr 2023 05:44:22 GMT
- Title: An Unsupervised STDP-based Spiking Neural Network Inspired By
Biologically Plausible Learning Rules and Connections
- Authors: Yiting Dong, Dongcheng Zhao, Yang Li, Yi Zeng
- Abstract summary: Spike-timing-dependent plasticity (STDP) is a general learning rule in the brain, but spiking neural networks (SNNs) trained with STDP alone are inefficient and perform poorly.
We design an adaptive synaptic filter and introduce an adaptive spiking threshold to enrich the representation ability of SNNs.
Our model achieves the current state-of-the-art performance among unsupervised STDP-based SNNs on the MNIST and FashionMNIST datasets.
- Score: 10.188771327458651
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The backpropagation algorithm has promoted the rapid development of deep
learning, but it relies on a large amount of labeled data and still has a large
gap with how humans learn. The human brain can quickly learn various conceptual
knowledge in a self-organized and unsupervised manner, accomplished through
coordinating various learning rules and structures in the human brain.
Spike-timing-dependent plasticity (STDP) is a general learning rule in the
brain, but spiking neural networks (SNNs) trained with STDP alone are
inefficient and perform poorly. In this paper, taking inspiration from
short-term synaptic plasticity, we design an adaptive synaptic filter and
introduce an adaptive spiking threshold as a form of neuronal plasticity to enrich the
representation ability of SNNs. We also introduce an adaptive lateral
inhibitory connection to dynamically adjust the spike balance and help the
network learn richer features. To speed up and stabilize the training of
unsupervised spiking neural networks, we design a sample temporal batch STDP
(STB-STDP), which updates weights based on multiple samples and moments. By
integrating the above three adaptive mechanisms and STB-STDP, our model greatly
accelerates the training of unsupervised spiking neural networks and improves
the performance of unsupervised SNNs on complex tasks. Our model achieves the
current state-of-the-art performance among unsupervised STDP-based SNNs on the
MNIST and FashionMNIST datasets. Further, we tested it on the more complex CIFAR10
dataset, and the results clearly demonstrate the superiority of our algorithm. Our
model is also the first work to apply unsupervised STDP-based SNNs to CIFAR10.
Moreover, in the small-sample learning scenario, it far exceeds a supervised ANN
with the same structure.
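The abstract names four ingredients (an adaptive synaptic filter, an adaptive spiking threshold, adaptive lateral inhibition, and STB-STDP) but not their exact equations. The sketch below only illustrates how such pieces typically fit together in an unsupervised STDP layer; every hyper-parameter, the winner-take-all form of the inhibition, and the batching scheme are assumptions, not the authors' settings.

```python
# Illustrative sketch only (not the paper's exact algorithm): one unsupervised
# layer combining pair-based STDP, a per-neuron adaptive spiking threshold,
# winner-take-all lateral inhibition, and weight updates accumulated over a
# small batch of samples in the spirit of STB-STDP. All constants are assumed.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 784, 100            # e.g. MNIST pixels -> feature neurons
T = 50                            # simulation time steps per sample
W = rng.uniform(0.0, 0.3, size=(n_out, n_in))
theta = np.full(n_out, 1.0)       # adaptive firing thresholds
tau_mem = tau_trace = 20.0
eta, theta_plus, theta_decay = 1e-3, 0.05, 0.999

def run_batch(batch):
    """batch: array of shape (batch_size, T, n_in) with 0/1 input spikes."""
    global theta
    dW = np.zeros_like(W)
    for sample in batch:
        v = np.zeros(n_out)               # membrane potentials
        pre_tr = np.zeros(n_in)           # presynaptic STDP trace
        post_tr = np.zeros(n_out)         # postsynaptic STDP trace
        for t in range(T):
            x = sample[t]
            pre_tr = pre_tr * np.exp(-1.0 / tau_trace) + x
            v = v * np.exp(-1.0 / tau_mem) + W @ x
            spikes = (v >= theta).astype(float)
            if spikes.any():
                winner = int(np.argmax(v * spikes))   # lateral inhibition (WTA)
                spikes = np.zeros(n_out)
                spikes[winner] = 1.0
                v[:] = 0.0                            # reset after the volley
                theta[winner] += theta_plus           # threshold adapts upward
            post_tr = post_tr * np.exp(-1.0 / tau_trace) + spikes
            # pair-based STDP: pre-before-post potentiates, post-before-pre depresses
            dW += eta * (np.outer(spikes, pre_tr) - np.outer(post_tr, x))
    theta = 1.0 + (theta - 1.0) * theta_decay         # thresholds relax slowly
    W[:] = np.clip(W + dW / len(batch), 0.0, 1.0)     # one batched update

# usage (random spike trains just to exercise the code):
# run_batch(rng.integers(0, 2, size=(8, T, n_in)).astype(float))
```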
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
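CSDP itself is not spelled out in this summary; as background, here is a minimal sketch of the forward-forward "goodness" idea it is described as building on. This is a generic Hinton-style forward-forward update on one dense layer, not the CSDP rule, and the sizes and threshold are illustrative assumptions.

```python
# Generic forward-forward-style local update on one layer (background for the
# summary above; this is NOT the CSDP rule). Sizes and constants are assumed.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(0.0, 0.1, size=(128, 784))
threshold, lr = 2.0, 1e-3

def goodness(h):
    return float(np.mean(h ** 2))        # layer "goodness": mean squared activity

def local_update(x, positive):
    h = np.maximum(W @ x, 0.0)           # forward pass through the layer (ReLU)
    g = goodness(h)
    # positive data should score above the threshold, negative data below it;
    # nudge the weights only when the sample is on the wrong side
    if (positive and g < threshold) or (not positive and g > threshold):
        sign = 1.0 if positive else -1.0
        grad_W = np.outer(sign * 2.0 * h * (h > 0) / h.size, x)
        W[:] += lr * grad_W
    return g

# usage: local_update(rng.random(784), positive=True)
```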
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Context Gating in Spiking Neural Networks: Achieving Lifelong Learning through Integration of Local and Global Plasticity [20.589970453110208]
Humans learn multiple tasks in succession with minimal mutual interference, through the context-gating mechanism in the prefrontal cortex (PFC).
We propose an SNN with context gating trained by a local plasticity rule (CG-SNN) for lifelong learning.
Experiments show that the proposed model is effective in retaining past learning experience and has better task selectivity than other methods during lifelong learning.
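The exact CG-SNN rule is not given here; a minimal sketch of the generic context-gating idea, with illustrative sizes and sparsity, could look like the following.

```python
# Generic context-gating sketch (illustrative, not the CG-SNN rule): a sparse,
# task-specific binary gate selects which hidden units participate in each
# task, which limits interference between tasks learned in succession.
import numpy as np

rng = np.random.default_rng(2)
n_hidden, n_tasks, sparsity = 200, 5, 0.2
gates = (rng.random((n_tasks, n_hidden)) < sparsity).astype(float)

def gated_activity(hidden_activity, task_id):
    return hidden_activity * gates[task_id]   # only the gated units pass through

# usage: gated_activity(rng.random(200), task_id=0)
```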
arXiv Detail & Related papers (2024-06-04T01:35:35Z) - Fully Spiking Actor Network with Intra-layer Connections for
Reinforcement Learning [51.386945803485084]
We focus on the task where the agent needs to learn multi-dimensional deterministic policies for control.
Most existing spike-based RL methods take the firing rate as the output of SNNs, and convert it to represent the continuous action space (i.e., the deterministic policy) through a fully-connected layer.
To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
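The rate-decoding scheme the summary attributes to most prior methods can be sketched as follows (the decoding step only, not the paper's fully spiking alternative; the dimensions and tanh squashing are assumptions).

```python
# Sketch of rate decoding for continuous control: output-layer spike counts are
# turned into firing rates and mapped through a fully connected layer to a
# continuous deterministic action. Dimensions and squashing are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_out_neurons, action_dim, T = 64, 4, 100
W_decode = rng.normal(0.0, 0.1, size=(action_dim, n_out_neurons))

def decode_action(spike_counts):
    rates = spike_counts / T                  # firing rates in [0, 1]
    return np.tanh(W_decode @ rates)          # continuous action in [-1, 1]

# usage: decode_action(rng.integers(0, T + 1, size=n_out_neurons))
```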
arXiv Detail & Related papers (2024-01-09T07:31:34Z) - Paired Competing Neurons Improving STDP Supervised Local Learning In Spiking Neural Networks [1.0787328610467803]
Direct training of Spiking Neural Networks (SNNs) on neuromorphic hardware has the potential to significantly reduce the energy consumption of artificial neural network training.
We propose Stabilized Supervised STDP (S2-STDP), a supervised STDP learning rule to train the classification layer of an SNN equipped with unsupervised STDP for feature extraction.
We introduce a training architecture called Paired Competing Neurons (PCN) to further enhance the learning capabilities of our classification layer trained with S2-STDP.
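Neither S2-STDP nor PCN is specified in detail here; the sketch below only illustrates the general idea of an error-driven, trace-based update on a classification layer sitting on top of STDP-learned features. It is not the actual S2-STDP rule and omits the paired-competing-neuron structure.

```python
# Illustrative error-driven, trace-based update for a spiking classification
# layer (general idea only; not the S2-STDP rule and without PCN pairs).
import numpy as np

rng = np.random.default_rng(4)
n_features, n_classes = 100, 10
W_cls = rng.uniform(0.0, 0.1, size=(n_classes, n_features))
eta = 1e-2

def supervised_step(pre_trace, out_spikes, label):
    # potentiate synapses onto the target-class neuron, depress synapses onto
    # non-target neurons that fired, both gated by the presynaptic trace
    target = np.zeros(n_classes)
    target[label] = 1.0
    error = target - out_spikes
    W_cls[:] = np.clip(W_cls + eta * np.outer(error, pre_trace), 0.0, 1.0)

# usage: supervised_step(rng.random(100), np.zeros(10), label=3)
```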
arXiv Detail & Related papers (2023-08-04T08:20:54Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
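The IPU-specific build is not shown here; for orientation, a minimal generic snnTorch forward pass (standard documented API, nothing IPU-specific, with illustrative layer sizes and step count) looks like this.

```python
# Minimal generic snnTorch usage (standard API; nothing IPU-specific, and the
# layer sizes / number of time steps are illustrative).
import torch
import torch.nn as nn
import snntorch as snn

fc = nn.Linear(784, 100)
lif = snn.Leaky(beta=0.9)            # leaky integrate-and-fire neuron layer

def forward_pass(data, num_steps=25):
    mem = lif.init_leaky()           # initialize membrane potential state
    spk_rec = []
    for _ in range(num_steps):
        cur = fc(data)               # same static input injected at each step
        spk, mem = lif(cur, mem)
        spk_rec.append(spk)
    return torch.stack(spk_rec)      # (num_steps, batch, 100) output spikes

# usage: forward_pass(torch.rand(32, 784))
```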
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
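OTTT's derivation is not reproduced here; a rough sketch of the forward-in-time idea is to carry a running trace of presynaptic activity and combine it with the instantaneous output-side gradient at each step, so nothing needs to be unrolled backward over time. Constants and shapes below are assumptions, not the paper's formulation.

```python
# Rough sketch of forward-in-time learning: a decaying trace of presynaptic
# activity is combined with the instantaneous gradient at each time step, so
# no backward unrolling through time is stored. Constants are assumptions.
import numpy as np

rng = np.random.default_rng(5)
n_in, n_out = 50, 10
W = rng.normal(0.0, 0.1, size=(n_out, n_in))
lam, lr = 0.9, 1e-3

def online_step(x_t, grad_out_t, trace):
    trace = lam * trace + x_t                    # running presynaptic trace
    W[:] -= lr * np.outer(grad_out_t, trace)     # local update at this step
    return trace

# usage: keep `trace = np.zeros(n_in)` across steps and call
# trace = online_step(x_t, grad_out_t, trace) at every time step.
```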
arXiv Detail & Related papers (2022-10-09T07:47:56Z) - Heterogeneous Recurrent Spiking Neural Network for Spatio-Temporal
Classification [13.521272923545409]
Spiking Neural Networks are often touted as brain-inspired learning models for the third wave of Artificial Intelligence.
This paper presents a heterogeneous recurrent spiking neural network (HRSNN) with unsupervised learning for video recognition tasks.
We show that HRSNN can achieve performance similar to state-of-the-art backpropagation-trained supervised SNNs, but with less computation.
arXiv Detail & Related papers (2022-09-22T16:34:01Z) - An STDP-Based Supervised Learning Algorithm for Spiking Neural Networks [20.309112286222238]
Spiking Neural Networks (SNNs) provide a more biologically plausible model of the brain.
We propose a supervised learning algorithm based on Spike-Timing Dependent Plasticity (STDP) for a hierarchical SNN consisting of Leaky Integrate-and-fire neurons.
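The leaky integrate-and-fire neuron the summary mentions has a standard discrete-time form; a minimal version follows, with illustrative time constant, threshold, and reset value.

```python
# Standard discrete-time leaky integrate-and-fire (LIF) neuron update; the
# time constant, threshold, and reset value are illustrative choices.
import numpy as np

tau, v_th, v_reset = 20.0, 1.0, 0.0

def lif_step(v, input_current):
    v = v + (-v + input_current) / tau      # leaky integration (Euler step)
    spike = v >= v_th
    v = np.where(spike, v_reset, v)         # reset the neurons that fired
    return v, spike.astype(float)

# usage: v, s = lif_step(np.zeros(100), np.random.rand(100))
```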
arXiv Detail & Related papers (2022-03-07T13:40:09Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
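The progressive tandem learning procedure itself is not described in this summary; as background, one common ingredient of ANN-to-SNN conversion pipelines is data-driven weight normalization, sketched below with an assumed percentile.

```python
# Background sketch of data-driven weight normalization, a common step in
# ANN-to-SNN conversion pipelines (not the paper's progressive tandem learning
# procedure); the percentile choice is an assumption.
import numpy as np

def normalize_layer(W, relu_activations, percentile=99.9):
    # scale the layer so near-maximal ReLU activations correspond to a firing
    # rate at (not above) the spiking neuron's maximum rate
    scale = np.percentile(relu_activations, percentile)
    return W / max(float(scale), 1e-8)

# usage: W_snn = normalize_layer(W_ann, activations_from_calibration_set)
```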
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
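The title suggests a rectified linear postsynaptic potential kernel; a plausible minimal form (the paper's exact definition may differ) is a PSP that is zero before the presynaptic spike and grows linearly afterwards.

```python
# Plausible minimal form of a rectified linear postsynaptic potential kernel
# (the paper's exact definition may differ): the PSP is zero before the
# presynaptic spike and grows linearly afterwards.
def rel_psp(t, t_spike):
    dt = t - t_spike
    return dt if dt > 0 else 0.0

# membrane potential as a weighted sum of PSPs from presynaptic spike times:
def membrane(t, weights, spike_times):
    return sum(w * rel_psp(t, ts) for w, ts in zip(weights, spike_times))
```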
arXiv Detail & Related papers (2020-03-26T11:13:07Z)