Dive into the Power of Neuronal Heterogeneity
- URL: http://arxiv.org/abs/2305.11484v2
- Date: Fri, 13 Oct 2023 06:17:52 GMT
- Title: Dive into the Power of Neuronal Heterogeneity
- Authors: Guobin Shen, Dongcheng Zhao, Yiting Dong, Yang Li, Yi Zeng
- Abstract summary: We show the challenges faced by backpropagation-based methods in optimizing Spiking Neural Networks (SNNs) and achieve more robust optimization of heterogeneous neurons in random networks using an Evolutionary Strategy (ES).
We find that membrane time constants play a crucial role in neural heterogeneity, and their distribution is similar to that observed in biological experiments.
- Score: 8.6837371869842
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The biological neural network is a vast and diverse structure with high
neural heterogeneity. Conventional Artificial Neural Networks (ANNs) primarily
focus on modifying the weights of connections through training while modeling
neurons as highly homogenized entities, leaving neural heterogeneity largely
unexplored. Only a few studies have addressed neural heterogeneity by
optimizing neuronal properties and connection weights to ensure network
performance. However, this strategy makes it difficult to isolate the specific contribution of
neuronal heterogeneity. In this paper, we first demonstrate the challenges
faced by backpropagation-based methods in optimizing Spiking Neural Networks
(SNNs) and achieve more robust optimization of heterogeneous neurons in random
networks using an Evolutionary Strategy (ES). Experiments on tasks such as
working memory, continuous control, and image recognition show that neuronal
heterogeneity can improve performance, particularly in long sequence tasks.
Moreover, we find that membrane time constants play a crucial role in neural
heterogeneity, and their distribution is similar to that observed in biological
experiments. Therefore, we believe that the neglected neuronal heterogeneity
plays an essential role, providing new approaches for exploring neural
heterogeneity in biology and new ways for designing more biologically plausible
neural networks.
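The optimization the abstract describes lends itself to a compact illustration: freeze the random connection weights and let an ES evolve only the per-neuron membrane time constants of a recurrent LIF network. The sketch below is a minimal, assumed instantiation (an NES-style update and a toy rate-matching fitness standing in for the paper's working-memory, control, and image tasks), not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, DT = 10, 64, 1.0                 # sizes and time step (ms); assumed
W_IN = rng.normal(0, 0.5, (N_HID, N_IN))      # fixed random input weights
W_REC = rng.normal(0, 0.2, (N_HID, N_HID))    # fixed random recurrent weights

def run_lif(tau, inputs):
    """Simulate LIF neurons with per-neuron membrane time constants tau:
    v <- v + dt/tau * (-v + I); spike and reset when v crosses threshold 1."""
    v = np.zeros(N_HID)
    spikes = np.zeros(N_HID)
    rate = np.zeros(N_HID)
    for x in inputs:                          # inputs has shape (T, N_IN)
        i_syn = W_IN @ x + W_REC @ spikes
        v = v + DT / tau * (-v + i_syn)
        spikes = (v >= 1.0).astype(float)
        v = np.where(spikes > 0, 0.0, v)      # reset spiking neurons
        rate += spikes
    return rate / len(inputs)

def fitness(tau, inputs, target=0.2):
    """Toy stand-in task: match a target mean firing rate."""
    return -np.mean((run_lif(tau, inputs) - target) ** 2)

# ES loop: only the heterogeneous time constants evolve; weights stay random.
log_tau = np.log(np.full(N_HID, 10.0))        # log-space keeps every tau positive
sigma, lr, n_pop = 0.1, 0.05, 32
inputs = rng.random((100, N_IN))

for gen in range(200):
    eps = rng.normal(0, 1, (n_pop, N_HID))    # per-neuron perturbations
    scores = np.array([fitness(np.exp(log_tau + sigma * e), inputs) for e in eps])
    scores = (scores - scores.mean()) / (scores.std() + 1e-8)
    log_tau += lr / (n_pop * sigma) * eps.T @ scores   # NES-style gradient estimate

tau = np.exp(log_tau)                          # evolved heterogeneous time constants
```

Because each neuron receives its own perturbation, the ES naturally produces a distribution of time constants rather than a single shared value, which is where the heterogeneity the abstract reports enters.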
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of our assumptions at the most basic neuronal level of neural representation.
arXiv Detail & Related papers (2024-10-17T17:47:54Z)
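For context on the AKOrN entry above: the dynamics it generalizes are the classical Kuramoto phase equations, in which each unit is an oscillator pulled toward its neighbors rather than a threshold gate. A minimal sketch with an assumed uniform coupling matrix and Euler integration (illustrative only, not the paper's formulation):

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt=0.01):
    """One Euler step of d(theta_i)/dt = omega_i + sum_j K[i, j] * sin(theta_j - theta_i)."""
    coupling = (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + coupling)

rng = np.random.default_rng(0)
n = 8
theta = rng.uniform(0, 2 * np.pi, n)   # oscillator phases: the unit "states"
omega = rng.normal(0, 1, n)            # natural frequencies, one per unit
K = np.full((n, n), 0.5)               # coupling; learnable in an AKOrN-like layer

for _ in range(1000):
    theta = kuramoto_step(theta, omega, K)

r = np.abs(np.exp(1j * theta).mean())  # order parameter: synchrony as a readout
```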
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Exploiting Heterogeneity in Timescales for Sparse Recurrent Spiking Neural Networks for Energy-Efficient Edge Computing [16.60622265961373]
Spiking Neural Networks (SNNs) represent the forefront of neuromorphic computing.
This paper brings together three studies on timescale heterogeneity and sparsity that improve SNN performance.
arXiv Detail & Related papers (2024-07-08T23:33:12Z)
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models with task-driven SNNs, balancing bio-inspiration against computational complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
- Growing Artificial Neural Networks for Control: the Role of Neuronal Diversity [7.479827648985631]
In biological evolution, complex neural structures grow from a handful of cellular ingredients.
This self-organisation is hypothesized to play an important part in the generalisation and robustness of biological neural networks.
We present an algorithm for growing artificial neural networks that solve reinforcement learning tasks.
arXiv Detail & Related papers (2024-05-14T11:21:52Z)
- Mitigating Communication Costs in Neural Networks: The Role of Dendritic Nonlinearity [28.243134476634125]
In this study, we scrutinized the importance of nonlinear dendrites within neural networks.
Our findings reveal that integrating dendritic structures can substantially enhance model capacity and performance.
arXiv Detail & Related papers (2023-06-21T00:28:20Z)
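One minimal reading of the dendritic-nonlinearity entry above: route a neuron's input through several branches, each with its own nonlinearity, before somatic integration, so a single unit computes more than one weighted sum. The branch count, tanh nonlinearities, and weight shapes below are assumptions for illustration, not the paper's exact model:

```python
import numpy as np

def dendritic_unit(x, w_branch, v_soma, n_branches=4):
    """Two-stage neuron: a nonlinearity per dendritic branch, then somatic summation.
    The input is split evenly across branches; tanh stands in for the dendritic nonlinearity."""
    segments = np.split(x, n_branches)
    branch_out = np.array([np.tanh(w @ s) for w, s in zip(w_branch, segments)])
    return np.tanh(v_soma @ branch_out)

rng = np.random.default_rng(0)
x = rng.normal(size=16)                            # one input vector
w_branch = [rng.normal(size=4) for _ in range(4)]  # one weight vector per branch
v_soma = rng.normal(size=4)                        # branch-to-soma weights
y = dendritic_unit(x, w_branch, v_soma)
```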
- Learning to Act through Evolution of Neural Diversity in Random Neural Networks [9.387749254963595]
In most artificial neural networks (ANNs), neural computation is abstracted to an activation function that is usually shared between all neurons.
We propose the optimization of neuro-centric parameters to attain a set of diverse neurons that can perform complex computations.
arXiv Detail & Related papers (2023-05-25T11:33:04Z)
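A plausible sketch of the "neuro-centric parameters" in the entry above: give each neuron a small parameter vector that shapes its activation function, and optimize those vectors rather than only the shared weights. The gain/bias/blend parameterization here is an assumption, not the paper's:

```python
import numpy as np

def neuron_act(z, p):
    """Per-neuron activation shaped by neuro-centric parameters p = (gain, bias, blend).
    blend interpolates between tanh and ReLU, so each neuron can evolve its own shape."""
    gain, bias, blend = p
    a = gain * z + bias
    return blend * np.tanh(a) + (1 - blend) * np.maximum(0.0, a)

rng = np.random.default_rng(0)
n_neurons = 32
P = rng.uniform([0.5, -1.0, 0.0], [2.0, 1.0, 1.0], (n_neurons, 3))  # one vector per neuron
z = rng.normal(size=n_neurons)                                      # pre-activations
h = np.array([neuron_act(zi, pi) for zi, pi in zip(z, P)])          # diverse responses
```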
- Connected Hidden Neurons (CHNNet): An Artificial Neural Network for Rapid Convergence [0.6218519716921521]
We propose a more robust model of artificial neural networks in which the hidden neurons residing in the same hidden layer are interconnected, which leads to rapid convergence.
In an experimental study of the proposed model in deep networks, we demonstrate a noticeable increase in convergence rate compared to conventional feed-forward neural networks.
arXiv Detail & Related papers (2023-05-17T14:00:38Z)
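The CHNNet entry above describes hidden neurons of the same layer feeding one another; a minimal way to realize that is an extra lateral pass through an intra-layer weight matrix. The single extra pass and zeroed diagonal are assumptions, not CHNNet's exact formulation:

```python
import numpy as np

def chn_layer(x, W, L):
    """Hidden layer with intra-layer connections: a standard pass h = tanh(Wx),
    then one lateral pass in which neurons also receive their peers' outputs via L."""
    h = np.tanh(W @ x)
    return np.tanh(W @ x + L @ h)

rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
W = rng.normal(0, 0.3, (n_hid, n_in))   # feed-forward weights
L = rng.normal(0, 0.3, (n_hid, n_hid))  # intra-layer (lateral) weights
np.fill_diagonal(L, 0.0)                # no self-connections
y = chn_layer(rng.normal(size=n_in), W, L)
```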
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
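CSDP is described above as a forward-forward-style, backpropagation-free rule. As background on that family, here is a minimal sketch of the layer-local "goodness" objective that forward-forward methods optimize; it illustrates the family only, not CSDP's specific spiking-circuit rule:

```python
import numpy as np

def goodness(h):
    """Forward-forward 'goodness': mean squared activity of one layer."""
    return np.mean(h ** 2)

def ff_layer_loss(h_pos, h_neg, theta=1.0):
    """Layer-local objective: push goodness above theta for positive (real) data
    and below theta for negative (contrastive) data; no gradients cross layers."""
    g_pos, g_neg = goodness(h_pos), goodness(h_neg)
    return np.log1p(np.exp(-(g_pos - theta))) + np.log1p(np.exp(g_neg - theta))

rng = np.random.default_rng(0)
W = rng.normal(0, 0.3, (16, 8))
h_pos = np.maximum(0.0, W @ rng.normal(size=8))  # layer activity for a real sample
h_neg = np.maximum(0.0, W @ rng.normal(size=8))  # activity for a negative sample
loss = ff_layer_loss(h_pos, h_neg)               # minimized locally, per layer
```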
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
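The POPPINS entry names integer quadratic integrate-and-fire neurons; in fixed-point hardware the quadratic membrane term becomes a multiply plus a shift. The update below is a hypothetical datapath sketch (the scale, thresholds, and shift amount are all assumed, not the processor's actual design):

```python
def int_qif_step(v, i_in, shift=7, v_peak=1 << 10, v_reset=-(1 << 6)):
    """Integer QIF update: v += ((v * v) >> shift) + i_in, with spike-and-reset at v_peak.
    All state stays in plain integers, as in a fixed-point hardware datapath."""
    v = v + ((v * v) >> shift) + i_in
    if v >= v_peak:
        return v_reset, 1   # spike emitted, membrane reset
    return v, 0

v, n_spikes = -(1 << 6), 0
for t in range(1000):
    v, s = int_qif_step(v, i_in=5)  # constant drive for illustration
    n_spikes += s
```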
This list is automatically generated from the titles and abstracts of the papers on this site.