Multi-compartment Neuron and Population Encoding improved Spiking Neural
Network for Deep Distributional Reinforcement Learning
- URL: http://arxiv.org/abs/2301.07275v1
- Date: Wed, 18 Jan 2023 02:45:38 GMT
- Title: Multi-compartment Neuron and Population Encoding improved Spiking Neural
Network for Deep Distributional Reinforcement Learning
- Authors: Yinqian Sun, Yi Zeng, Feifei Zhao and Zhuoya Zhao
- Abstract summary: Spiking neural networks (SNNs) exhibit significantly lower energy consumption and are more suitable for incorporating multi-scale biological characteristics.
In this paper, we propose a brain-inspired SNN-based deep distributional reinforcement learning algorithm that combines a bio-inspired multi-compartment neuron (MCN) model with a population coding method.
- Score: 3.036382664997076
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Inspired by the information processing with binary spikes in the brain, the
spiking neural networks (SNNs) exhibit significantly lower energy consumption and
are more suitable for incorporating multi-scale biological characteristics.
Spiking neurons, the basic information-processing units of SNNs, are simplified
in most SNNs to the leaky integrate-and-fire (LIF) point neuron, which does not
take into account the multi-compartment structural properties of biological
neurons. This limits the computational and learning capabilities of SNNs. In
this paper, we propose a brain-inspired SNN-based deep distributional
reinforcement learning algorithm that combines a bio-inspired
multi-compartment neuron (MCN) model with a population coding method. The
proposed multi-compartment neuron models the structure and function of
apical-dendrite, basal-dendrite, and somatic computing compartments to achieve
computational power close to that of biological neurons. In addition, we present an
implicit fractional embedding method based on spiking neuron population
encoding. We tested our model on Atari games, and the experiment results show
that the performance of our model surpasses the vanilla ANN-based FQF model and
the ANN-SNN-conversion-based Spiking-FQF model. The ablation experiments
show that the proposed multi-compartment neuron model and the implicit
population spike representation of quantile fractions play an important role in
realizing SNN-based deep distributional reinforcement learning.
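The abstract describes a neuron with apical-dendrite, basal-dendrite, and somatic computing compartments. As a rough illustration of the idea only (the class name, coupling constants, and LIF-style somatic dynamics below are all assumptions, not the paper's actual MCN equations), a three-compartment update might look like:

```python
import numpy as np

class MultiCompartmentNeuron:
    """Minimal sketch of a three-compartment spiking neuron.

    Hypothetical parameterization for illustration; the paper's actual
    MCN model and its compartment dynamics may differ.
    """

    def __init__(self, n, tau=2.0, g_apical=0.5, g_basal=0.5, v_threshold=1.0):
        self.n = n
        self.tau = tau                  # somatic membrane time constant
        self.g_apical = g_apical        # apical-dendrite -> soma coupling
        self.g_basal = g_basal          # basal-dendrite -> soma coupling
        self.v_threshold = v_threshold
        self.v_soma = np.zeros(n)       # somatic membrane potential

    def step(self, apical_input, basal_input):
        # Each dendritic compartment passes its input to the soma through
        # its own coupling weight; the soma then follows leaky
        # integrate-and-fire dynamics on the combined dendritic current.
        dendritic_current = (self.g_apical * apical_input
                             + self.g_basal * basal_input)
        self.v_soma += (dendritic_current - self.v_soma) / self.tau
        spikes = (self.v_soma >= self.v_threshold).astype(float)
        self.v_soma = np.where(spikes > 0, 0.0, self.v_soma)  # hard reset
        return spikes
```

In such a scheme, top-down feedback would typically drive the apical compartment while feedforward input drives the basal one, which is the usual motivation for separating the two dendritic streams.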
Related papers
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, combining bio-inspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - A survey on learning models of spiking neural membrane systems and spiking neural networks [0.0]
Spiking neural networks (SNN) are a biologically inspired model of neural networks with certain brain-like properties.
In SNN, communication between neurons takes place through the spikes and spike trains.
Spiking neural P systems (SNPS) can be considered a branch of SNN based more on the principles of formal automata.
arXiv Detail & Related papers (2024-03-27T14:26:41Z) - Deep Pulse-Coupled Neural Networks [31.65350290424234]
Spiking Neural Networks (SNNs) capture the information processing mechanism of the brain by taking advantage of spiking neurons.
In this work, we leverage a more biologically plausible neural model with complex dynamics, i.e., a pulse-coupled neural network (PCNN).
We construct deep pulse-coupled neural networks (DPCNNs) by replacing commonly used LIF neurons in SNNs with PCNN neurons.
arXiv Detail & Related papers (2023-12-24T08:26:00Z) - SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Exploiting Noise as a Resource for Computation and Learning in Spiking
Neural Networks [32.0086664373154]
This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
arXiv Detail & Related papers (2023-05-25T13:21:26Z) - Multi-scale Evolutionary Neural Architecture Search for Deep Spiking
Neural Networks [7.271032282434803]
We propose a Multi-Scale Evolutionary Neural Architecture Search (MSE-NAS) for Spiking Neural Networks (SNNs).
MSE-NAS evolves individual neuron operation, self-organized integration of multiple circuit motifs, and global connectivity across motifs through a brain-inspired indirect evaluation function, Representational Dissimilarity Matrices (RDMs).
The proposed algorithm achieves state-of-the-art (SOTA) performance with shorter simulation steps on static datasets and neuromorphic datasets.
arXiv Detail & Related papers (2023-04-21T05:36:37Z) - Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a
Polynomial Net Study [55.12108376616355]
Studies of the NTK have been devoted to typical neural network architectures, but are incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK.
arXiv Detail & Related papers (2022-09-16T06:36:06Z) - SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network [12.237928453571636]
Spiking Neural Networks (SNNs) have piqued researchers' interest because of their capacity to process temporal information and low power consumption.
Current state-of-the-art methods are limited in biological plausibility and performance because their neurons are generally built on the simple Leaky Integrate-and-Fire (LIF) model.
Due to the high level of dynamic complexity, modern neuron models have seldom been implemented in SNN practice.
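The summary above attributes the limits of current methods to the simple LIF point neuron. For reference, a minimal discrete-time LIF update (parameter names and default values here are illustrative, not taken from the paper) can be sketched as:

```python
import numpy as np

def lif_step(v, input_current, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """One discrete-time update of a leaky integrate-and-fire (LIF) point
    neuron. The whole neuron is a single compartment: it leaks toward its
    input, fires when it crosses threshold, and resets."""
    v = v + (input_current - v) / tau       # leaky integration toward input
    spike = (v >= v_threshold).astype(float)
    v = np.where(spike > 0, v_reset, v)     # reset membrane after a spike
    return v, spike
```

The contrast with multi-compartment models is that all inputs collapse into one `input_current` here, so the neuron cannot distinguish, say, feedforward from feedback streams.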
arXiv Detail & Related papers (2022-03-30T07:50:44Z) - Deep Reinforcement Learning Guided Graph Neural Networks for Brain
Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z) - POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with
Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.