Ensemble plasticity and network adaptability in SNNs
- URL: http://arxiv.org/abs/2203.07039v1
- Date: Fri, 11 Mar 2022 01:14:51 GMT
- Title: Ensemble plasticity and network adaptability in SNNs
- Authors: Mahima Milinda Alwis Weerasinghe, David Parry, Grace Wang, Jacqueline Whalley
- Abstract summary: Artificial Spiking Neural Networks (ASNNs) promise greater information processing efficiency because of discrete event-based (i.e., spike) computation.
We introduce a novel ensemble learning method based on entropy and network activation, operated exclusively using spiking activity.
It was discovered that pruning lower spike-rate neuron clusters resulted in increased generalization or a predictable decline in performance.
- Score: 0.726437825413781
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Artificial Spiking Neural Networks (ASNNs) promise greater information
processing efficiency because of discrete, event-based (i.e., spike)
computation. Several Machine Learning (ML) applications use biologically
inspired plasticity mechanisms as unsupervised learning techniques to increase
the robustness of ASNNs while preserving efficiency. Spike Time Dependent
Plasticity (STDP) and Intrinsic Plasticity (IP) (i.e., dynamic spiking
threshold adaptation) are two such mechanisms that have been combined to form
an ensemble learning method. However, it is not clear how this ensemble
learning should be regulated based on spiking activity. Moreover, previous
studies have attempted threshold-based synaptic pruning following STDP to
increase inference efficiency, at the cost of performance in ASNNs. This type
of structural adaptation operates on individual weights and therefore ignores
spiking activity when pruning, even though spiking activity is a better
representation of the input stimuli. We envisaged that plasticity-based spike
regulation and spike-based pruning would yield ASNNs that perform better in
low-resource situations. In this paper, a novel ensemble learning method based
on entropy and network activation is introduced, combined with a spike-rate
neuron pruning technique, both operated exclusively using spiking activity.
Two electroencephalography (EEG) datasets are used as the input for
classification experiments with a three-layer feed-forward ASNN trained using
one-pass learning. During the learning process, we observed neurons assembling
into a hierarchy of clusters based on spiking rate. We found that pruning
lower spike-rate neuron clusters resulted in either increased generalization
or a predictable decline in performance.
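
The abstract describes two spike-driven mechanisms working together: an entropy- and activation-based signal that regulates the STDP-IP ensemble, and pruning of whole low spike-rate neuron clusters. The paper's own implementation is not reproduced here; the following is a minimal NumPy sketch of those ideas under explicit assumptions: the equal-size rate clusters, the Shannon-entropy regulation signal, and the homeostatic rule in `adapt_threshold` are illustrative stand-ins, not the authors' method.

```python
import numpy as np

def adapt_threshold(theta, spiked, target_prob=0.5, eta=0.01):
    """Illustrative Intrinsic Plasticity (IP) step: raise a neuron's firing
    threshold when it spikes more often than a target probability and lower
    it otherwise. The specific homeostatic rule is an assumption."""
    return theta + eta * (spiked.astype(float) - target_prob)

def spike_entropy(spike_counts):
    """Shannon entropy of the normalized spike-count distribution; a
    plausible stand-in for the paper's entropy-based regulation signal."""
    p = spike_counts / spike_counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def prune_low_rate_clusters(spike_counts, duration, n_clusters=4, n_prune=1):
    """Group neurons into equal-size clusters by firing rate and drop the
    lowest-rate cluster(s); returns a boolean keep-mask over neurons."""
    rates = spike_counts / duration
    order = np.argsort(rates)                      # neurons sorted by rate
    clusters = np.array_split(order, n_clusters)   # coarse rate hierarchy
    keep = np.ones(len(rates), dtype=bool)
    for cluster in clusters[:n_prune]:             # lowest-rate clusters first
        keep[cluster] = False
    return keep

# Hypothetical usage on random spiking activity (100 neurons, 1 s window).
rng = np.random.default_rng(0)
counts = rng.poisson(lam=rng.uniform(1.0, 30.0, size=100))
keep = prune_low_rate_clusters(counts, duration=1.0)
theta = adapt_threshold(np.ones(100), spiked=(counts > np.median(counts)))
print(f"kept {keep.sum()}/100 neurons; entropy = {spike_entropy(counts):.2f} bits")
```

A mask like `keep` is the kind of handle the reported result suggests probing: dropping the lowest-rate clusters either improved generalization or degraded performance in a predictable way.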
Related papers
- Benchmarking Spiking Neural Network Learning Methods with Varying
Locality [2.323924801314763]
Spiking Neural Networks (SNNs) provide more realistic neuronal dynamics.
Information is processed as spikes within SNNs via an event-based mechanism.
We show that training SNNs is challenging due to the non-differentiable nature of the spiking mechanism.
arXiv Detail & Related papers (2024-02-01T19:57:08Z)
- Activity Sparsity Complements Weight Sparsity for Efficient RNN
Inference [2.0822643340897273]
We show that activity sparsity can compose multiplicatively with parameter sparsity in a recurrent neural network model.
We achieve up to $20\times$ reduction of computation while maintaining perplexities below $60$ on the Penn Treebank language modeling task (a toy sketch of this multiplicative composition appears after this list).
arXiv Detail & Related papers (2023-11-13T08:18:44Z)
- SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- Heterogeneous Neuronal and Synaptic Dynamics for Spike-Efficient
Unsupervised Learning: Theory and Design Principles [13.521272923545409]
We analytically show that the diversity in neurons' integration/relaxation dynamics improves an RSNN's ability to learn more distinct input patterns (higher memory capacity).
We further prove that heterogeneous Spike-Timing-Dependent-Plasticity (STDP) dynamics of synapses reduce spiking activity but preserve memory capacity.
arXiv Detail & Related papers (2023-02-22T19:48:02Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural
Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends a recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- An Unsupervised STDP-based Spiking Neural Network Inspired By
Biologically Plausible Learning Rules and Connections [10.188771327458651]
Spike-timing-dependent plasticity (STDP) is a general learning rule in the brain, but spiking neural networks (SNNs) trained with STDP alone are inefficient and perform poorly.
We design an adaptive synaptic filter and introduce an adaptive spiking threshold to enrich the representation ability of SNNs.
Our model achieves state-of-the-art performance among unsupervised STDP-based SNNs on the MNIST and FashionMNIST datasets.
arXiv Detail & Related papers (2022-07-06T14:53:32Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
However, there has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves into the intrinsic structures of SNNs, elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- SpikePropamine: Differentiable Plasticity in Spiking Neural Networks [0.0]
We introduce a framework for learning the dynamics of synaptic plasticity and neuromodulated synaptic plasticity in Spiking Neural Networks (SNNs).
We show that SNNs augmented with differentiable plasticity are sufficient for solving a set of challenging temporal learning tasks.
These networks are also shown to be capable of producing locomotion on a high-dimensional robotic learning task.
arXiv Detail & Related papers (2021-06-04T19:29:07Z)
- Recurrent Neural Network Learning of Performance and Intrinsic
Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
- Towards Efficient Processing and Learning with Spikes: New Approaches
for Multi-Spike Learning [59.249322621035056]
We propose two new multi-spike learning rules that demonstrate better performance than other baselines on various tasks.
In the feature detection task, we re-examine the capability of unsupervised STDP and present its limitations.
Our proposed learning rules can reliably solve the task over a wide range of conditions without specific constraints being applied.
arXiv Detail & Related papers (2020-05-02T06:41:20Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future DeepSNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
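
As flagged in the "Activity Sparsity Complements Weight Sparsity" entry above, here is a toy sketch of the multiplicative composition it claims: if a fraction of weights is pruned and a fraction of units is inactive on a given step, multiply-accumulate work scales roughly with the product of the two densities. The 80%/75% split below is an assumption chosen only to reproduce the entry's $20\times$ headline, not a figure from the paper.

```python
def relative_compute(weight_sparsity: float, activity_sparsity: float) -> float:
    """Fraction of dense-model multiply-accumulates remaining when weight
    sparsity and activity sparsity compose multiplicatively."""
    return (1.0 - weight_sparsity) * (1.0 - activity_sparsity)

# 80% weight sparsity composed with 75% activity sparsity -> 20x less compute.
print(1.0 / relative_compute(0.80, 0.75))  # 20.0
```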