Heterogeneous Neuronal and Synaptic Dynamics for Spike-Efficient
Unsupervised Learning: Theory and Design Principles
- URL: http://arxiv.org/abs/2302.11618v2
- Date: Sat, 5 Aug 2023 15:22:00 GMT
- Title: Heterogeneous Neuronal and Synaptic Dynamics for Spike-Efficient
Unsupervised Learning: Theory and Design Principles
- Authors: Biswadeep Chakraborty and Saibal Mukhopadhyay
- Abstract summary: We analytically show that the diversity in neurons' integration/relaxation dynamics improves an RSNN's ability to learn more distinct input patterns (higher memory capacity)
We further prove that heterogeneous Spike-Timing-Dependent-Plasticity (STDP) dynamics of synapses reduce spiking activity but preserve memory capacity.
- Score: 13.521272923545409
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper shows that the heterogeneity in neuronal and synaptic dynamics
reduces the spiking activity of a Recurrent Spiking Neural Network (RSNN) while
improving prediction performance, enabling spike-efficient (unsupervised)
learning. We analytically show that the diversity in neurons'
integration/relaxation dynamics improves an RSNN's ability to learn more
distinct input patterns (higher memory capacity), leading to improved
classification and prediction performance. We further prove that heterogeneous
Spike-Timing-Dependent-Plasticity (STDP) dynamics of synapses reduce spiking
activity but preserve memory capacity. The analytical results motivate
Heterogeneous RSNN (HRSNN) design using Bayesian optimization to determine the
heterogeneity in neurons and synapses that improves $\mathcal{E}$, defined as the
ratio of spiking activity to memory capacity. The empirical results on time
series classification and prediction tasks show that optimized HRSNN increases
performance and reduces spiking activity compared to a homogeneous RSNN.
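To make the two optimized forms of heterogeneity concrete, below is a minimal NumPy sketch, not the authors' implementation: every parameter name, range, and value is an illustrative assumption. It wires a small recurrent network in which each neuron draws its own membrane (integration/relaxation) time constant and each synapse draws its own STDP trace time constants, then runs unsupervised trace-based STDP while counting spikes.

# Minimal sketch (illustrative assumptions throughout, not the paper's code) of
# per-neuron membrane time constants and per-synapse STDP time constants.
import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 100, 500, 1.0                            # neurons, steps, ms (assumed)

# Heterogeneous neurons: a homogeneous RSNN shares one tau_mem; here each
# neuron draws its own integration/relaxation time constant.
tau_mem = rng.uniform(10.0, 100.0, size=N)          # ms (assumed range)

# Heterogeneous STDP: each synapse draws its own pre-/post-trace time constants.
tau_pre = rng.uniform(10.0, 40.0, size=(N, N))      # ms (assumed range)
tau_post = rng.uniform(10.0, 40.0, size=(N, N))     # ms (assumed range)

W = 0.1 * rng.standard_normal((N, N))               # recurrent weights, W[i, j]: j -> i
np.fill_diagonal(W, 0.0)
v = np.zeros(N)                                     # membrane potentials
x_pre = np.zeros((N, N))                            # presynaptic traces
x_post = np.zeros((N, N))                           # postsynaptic traces
spikes = np.zeros(N)
v_th, a_plus, a_minus = 1.0, 0.01, 0.012            # threshold, STDP rates (assumed)

total_spikes = 0
for _ in range(T):
    I_ext = (rng.random(N) < 0.05).astype(float)    # random external drive (assumed)
    # Leaky integration: each neuron relaxes at its own rate dt / tau_mem[i].
    v = v + (dt / tau_mem) * (-v) + W @ spikes + I_ext
    spikes = (v >= v_th).astype(float)
    v = np.where(spikes > 0, 0.0, v)                # reset fired neurons
    total_spikes += int(spikes.sum())

    # Per-synapse trace decay, then unsupervised STDP: potentiate W[i, j] when
    # post-neuron i fires on a recent pre-trace, depress it when pre-neuron j
    # fires on a recent post-trace.
    x_pre *= np.exp(-dt / tau_pre)
    x_post *= np.exp(-dt / tau_post)
    W += a_plus * x_pre * spikes[:, None] - a_minus * x_post * spikes[None, :]
    x_pre += spikes[None, :]                        # record pre spikes of neuron j
    x_post += spikes[:, None]                       # record post spikes of neuron i
    np.fill_diagonal(W, 0.0)

# Measuring memory capacity needs the readout analysis from the paper, so this
# sketch only reports raw spiking activity.
print(f"mean spikes per neuron per step: {total_spikes / (N * T):.4f}")

In the paper, Bayesian optimization chooses the heterogeneity itself; in terms of this sketch, it would search over the distributions from which tau_mem, tau_pre, and tau_post are drawn, scoring each candidate network by the resulting $\mathcal{E}$ (spiking activity relative to measured memory capacity).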
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of our assumptions at the most basic neuronal level of neural representation.
arXiv Detail & Related papers (2024-10-17T17:47:54Z)
- Unveiling the Potential of Spiking Dynamics in Graph Representation Learning through Spatial-Temporal Normalization and Coding Strategies [15.037300421748107]
Spiking neural networks (SNNs) have attracted substantial interest due to their potential to replicate the energy-efficient and event-driven processing of biological neurons.
This work examines the unique properties and benefits of spiking dynamics in enhancing graph representation learning.
We propose a spike-based graph neural network model that incorporates spiking dynamics, enhanced by a novel spatial-temporal feature normalization (STFN) technique.
arXiv Detail & Related papers (2024-07-30T02:53:26Z)
- Exploiting Heterogeneity in Timescales for Sparse Recurrent Spiking Neural Networks for Energy-Efficient Edge Computing [16.60622265961373]
Spiking Neural Networks (SNNs) represent the forefront of neuromorphic computing.
This paper brings together three studies that improve SNN performance.
arXiv Detail & Related papers (2024-07-08T23:33:12Z)
- Understanding the Convergence in Balanced Resonate-and-Fire Neurons [1.4186974630564675]
Resonate-and-Fire (RF) neurons are an interesting complementary model to integrator neurons in spiking neural networks (SNNs).
The recently proposed balanced resonate-and-fire (BRF) neuron marked a significant methodological advance in terms of task performance, spiking and parameter efficiency.
This paper aims at providing further intuitions about how and why these convergence advantages emerge.
arXiv Detail & Related papers (2024-06-01T10:04:55Z)
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- Hallmarks of Optimization Trajectories in Neural Networks: Directional Exploration and Redundancy [75.15685966213832]
We analyze the rich directional structure of optimization trajectories represented by their pointwise parameters.
We show that training only scalar batchnorm parameters from partway into training matches the performance of training the entire network.
arXiv Detail & Related papers (2024-03-12T07:32:47Z)
- Expressive architectures enhance interpretability of dynamics-based neural population models [2.294014185517203]
We evaluate the performance of sequential autoencoders (SAEs) in recovering latent chaotic attractors from simulated neural datasets.
We found that SAEs with widely-used recurrent neural network (RNN)-based dynamics were unable to infer accurate firing rates at the true latent state dimensionality.
arXiv Detail & Related papers (2022-12-07T16:44:26Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
There has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves deep into the intrinsic structures of SNNs, by elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- Ensemble plasticity and network adaptability in SNNs [0.726437825413781]
Artificial Spiking Neural Networks (ASNNs) promise greater information processing efficiency because of discrete event-based (i.e., spike) computation.
We introduce a novel ensemble learning method based on entropy and network activation, operated exclusively using spiking activity.
It was discovered that pruning lower spike-rate neuron clusters resulted in increased generalization or a predictable decline in performance.
arXiv Detail & Related papers (2022-03-11T01:14:51Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.