Multi-compartment Neuron and Population Encoding Powered Spiking Neural Network for Deep Distributional Reinforcement Learning
- URL: http://arxiv.org/abs/2301.07275v2
- Date: Tue, 25 Feb 2025 02:49:20 GMT
- Title: Multi-compartment Neuron and Population Encoding Powered Spiking Neural Network for Deep Distributional Reinforcement Learning
- Authors: Yinqian Sun, Feifei Zhao, Zhuoya Zhao, Yi Zeng
- Abstract summary: In spiking neural networks (SNNs), spiking neurons serve as the fundamental information processing units. In this paper, we propose a brain-inspired deep distributional reinforcement learning algorithm based on SNNs.
- Score: 6.0483672878162515
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Inspired by the brain's information processing using binary spikes, spiking neural networks (SNNs) offer significant reductions in energy consumption and are more adept at incorporating multi-scale biological characteristics. In SNNs, spiking neurons serve as the fundamental information processing units. However, in most models, these neurons are typically simplified, focusing primarily on the leaky integrate-and-fire (LIF) point neuron model while neglecting the structural properties of biological neurons. This simplification hampers the computational and learning capabilities of SNNs. In this paper, we propose a brain-inspired deep distributional reinforcement learning algorithm based on SNNs, which integrates a bio-inspired multi-compartment neuron (MCN) model with a population coding approach. The proposed MCN model simulates the structure and function of apical dendritic, basal dendritic, and somatic compartments, achieving computational power comparable to that of biological neurons. Additionally, we introduce an implicit fractional embedding method based on population coding of spiking neurons. We evaluated our model on Atari games, and the experimental results demonstrate that it surpasses the vanilla FQF model, which utilizes traditional artificial neural networks (ANNs), as well as the Spiking-FQF models that are based on ANN-to-SNN conversion methods. Ablation studies further reveal that the proposed multi-compartment neuron model and the quantile fraction implicit population spike representation significantly enhance the performance of MCS-FQF while also reducing power consumption.
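To make the abstract's two main ingredients concrete, the sketch below pairs a toy three-compartment neuron with a Gaussian population code for a quantile fraction. It is a minimal illustration under assumed dynamics and constants, not the authors' MCN model or their implicit fractional embedding; all names (`MultiCompartmentNeuron`, `population_encode`, `tau_a`, `sigma`, ...) are hypothetical.

```python
import numpy as np

class MultiCompartmentNeuron:
    """Illustrative three-compartment LIF-style neuron: apical and basal
    dendrites each leak-integrate their own input, and the soma integrates
    the combined dendritic drive before a threshold-and-reset step. The
    constants and the simple summing rule are hypothetical simplifications."""

    def __init__(self, tau_a=2.0, tau_b=2.0, tau_s=4.0,
                 v_th=1.0, v_reset=0.0, dt=1.0):
        self.da = np.exp(-dt / tau_a)   # apical decay per step
        self.db = np.exp(-dt / tau_b)   # basal decay per step
        self.ds = np.exp(-dt / tau_s)   # somatic decay per step
        self.v_th, self.v_reset = v_th, v_reset
        self.u_a = self.u_b = self.v = 0.0

    def step(self, i_apical, i_basal):
        self.u_a = self.da * self.u_a + i_apical          # apical compartment
        self.u_b = self.db * self.u_b + i_basal           # basal compartment
        self.v = self.ds * self.v + self.u_a + self.u_b   # soma integrates both
        if self.v >= self.v_th:
            self.v = self.v_reset
            return 1.0
        return 0.0


def population_encode(tau, n=32, sigma=0.05, rng=np.random.default_rng(0)):
    """Encode a scalar quantile fraction tau in [0, 1] as one step of spikes
    from a population with evenly spaced Gaussian receptive fields (a generic
    population code, offered as one plausible reading of the paper's idea)."""
    centers = np.linspace(0.0, 1.0, n)
    p_fire = np.exp(-0.5 * ((tau - centers) / sigma) ** 2)
    return (rng.random(n) < p_fire).astype(np.float32)


# Usage: drive one neuron for a few steps and spike-encode a fraction.
neuron = MultiCompartmentNeuron()
soma_spikes = [neuron.step(0.3, 0.4) for _ in range(10)]
tau_spikes = population_encode(0.25)
```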
Related papers
- Spiking World Model with Multi-Compartment Neurons for Model-based Reinforcement Learning [6.0483672878162515]
Brain-inspired spiking neural networks (SNNs) have garnered significant research attention in algorithm design and perception applications.
However, their potential in the decision-making domain, particularly in model-based reinforcement learning, remains underexplored.
We propose a multi-compartment neuron model capable of nonlinearly integrating information from multiple dendritic sources to dynamically process long sequential inputs.
arXiv Detail & Related papers (2025-03-02T03:40:10Z) - Channel-wise Parallelizable Spiking Neuron with Multiplication-free Dynamics and Large Temporal Receptive Fields [32.349167886062105]
Spiking Neural Networks (SNNs) are distinguished from Artificial Neural Networks (ANNs) for their sophisticated neuronal dynamics and sparse binary activations (spikes) inspired by the biological neural system.
Traditional neuron models use iterative step-by-step dynamics, resulting in serial computation and slow training of SNNs.
Recent parallelizable spiking neuron models have been proposed to fully utilize the massive parallel computing ability of graphics processing units to accelerate the training of SNNs.
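The contrast drawn in this entry is easy to see in code: a hard reset makes each LIF step depend on the previous one, while the reset-free membrane trace is a linear recurrence whose whole trajectory can be materialized at once. Below is a minimal sketch of that core idea only (generic, with made-up constants; not this paper's channel-wise model, which also handles the reset and uses multiplication-free dynamics).

```python
import numpy as np

def lif_serial(currents, decay=0.9, v_th=1.0):
    """Step-by-step LIF: the hard reset couples consecutive time steps,
    forcing serial computation."""
    v, spikes = 0.0, []
    for i in currents:
        v = decay * v + i
        s = v >= v_th
        if s:
            v = 0.0  # hard reset: step t+1 depends on the spike at step t
        spikes.append(float(s))
    return np.array(spikes)

def membrane_parallel(currents, decay=0.9):
    """Without reset, v[t] = sum_{k<=t} decay**(t-k) * I[k] is a linear
    recurrence, so all time steps can be computed in one matrix product
    (and hence in parallel on a GPU)."""
    t = np.arange(len(currents))
    kernel = np.tril(decay ** (t[:, None] - t[None, :]))  # decay**(t-k), k <= t
    return kernel @ np.asarray(currents)

I = np.random.rand(100)
v = membrane_parallel(I)                # entire membrane trace at once
spikes = (v >= 1.0).astype(np.float32)  # thresholding applied afterwards
```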
arXiv Detail & Related papers (2025-01-24T13:44:08Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold higher power efficiency with a moderate performance loss compared to the original CNNs.
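For context, conversion-based methods build on the classic observation that an integrate-and-fire neuron with a subtractive ("soft") reset fires at a rate approximating a ReLU of its input; this paper's novelty is handling complex-valued layers, which the toy real-valued sketch below does not attempt.

```python
def relu(x):
    return max(x, 0.0)

def if_rate(x, threshold=1.0, steps=200):
    """Firing rate of an integrate-and-fire neuron under constant drive x:
    with subtractive reset, the rate over enough steps approximates
    relu(x) / threshold (for x below the threshold)."""
    v, n_spikes = 0.0, 0
    for _ in range(steps):
        v += x
        if v >= threshold:
            v -= threshold  # subtractive reset preserves the residual charge
            n_spikes += 1
    return n_spikes / steps

print(relu(0.37), if_rate(0.37))  # ~0.37 vs 0.37: the rate approximates ReLU
```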
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models with task-driven SNNs, balancing bioinspiration and computational complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - A survey on learning models of spiking neural membrane systems and spiking neural networks [0.0]
Spiking neural networks (SNNs) are a biologically inspired model of neural networks with certain brain-like properties.
In SNNs, communication between neurons takes place through spikes and spike trains.
Spiking neural P systems (SNPS) can be considered a branch of SNNs grounded more in the principles of formal automata.
arXiv Detail & Related papers (2024-03-27T14:26:41Z) - Deep Pulse-Coupled Neural Networks [31.65350290424234]
Spiking Neural Networks (SNNs) capture the information processing mechanism of the brain by taking advantage of spiking neurons.
In this work, we leverage a more biologically plausible neural model with complex dynamics, i.e., the pulse-coupled neural network (PCNN).
We construct deep pulse-coupled neural networks (DPCNNs) by replacing commonly used LIF neurons in SNNs with PCNN neurons.
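For readers unfamiliar with PCNNs, the classic Eckhorn-style dynamics reduced to a single scalar neuron look roughly as follows (a textbook formulation with made-up constants, not necessarily the exact variant used in the DPCNN paper).

```python
import numpy as np

def pcnn_neuron(stimulus, steps=50, beta=0.5,
                a_f=0.1, a_l=0.3, a_e=0.3, v_f=0.1, v_l=0.2, v_e=5.0):
    """Single pulse-coupled neuron: feeding (F) and linking (L) channels are
    coupled multiplicatively, and a dynamic threshold (E) jumps after every
    spike, producing the rich, bursty firing that distinguishes PCNNs from
    plain LIF neurons."""
    F = L = E = Y = 0.0
    spikes = []
    for _ in range(steps):
        F = np.exp(-a_f) * F + v_f * Y + stimulus  # feeding input channel
        L = np.exp(-a_l) * L + v_l * Y             # linking input channel
        U = F * (1.0 + beta * L)                   # multiplicative coupling
        Y = float(U > E)                           # fire if U exceeds threshold
        E = np.exp(-a_e) * E + v_e * Y             # threshold jumps on a spike
        spikes.append(Y)
    return spikes

print(pcnn_neuron(stimulus=1.0)[:20])  # characteristic oscillatory pattern
```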
arXiv Detail & Related papers (2023-12-24T08:26:00Z) - Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach, the fully spiking denoising diffusion implicit model (FSDDIM), to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z) - SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
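Two of the standard coding schemes that such a heterogeneous design could mix, shown in toy form (which scheme each layer gets is the paper's design question, not shown here):

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(x, steps=20):
    """Rate code: intensity x in [0, 1] becomes Bernoulli spikes with
    probability x at every time step (robust, but spike-hungry)."""
    return (rng.random(steps) < x).astype(np.float32)

def latency_encode(x, steps=20):
    """Latency code: a single spike whose timing carries the value; a
    stronger input fires earlier (sparse, but timing-sensitive)."""
    train = np.zeros(steps, dtype=np.float32)
    if x > 0:
        train[int(round((1.0 - x) * (steps - 1)))] = 1.0
    return train

x = 0.8
print(rate_encode(x).sum())               # many spikes: about x * steps
print(int(np.argmax(latency_encode(x))))  # one early spike, at step ~4
```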
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Exploiting Noise as a Resource for Computation and Learning in Spiking Neural Networks [32.0086664373154]
This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
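One generic way to make a spiking neuron noisy is "escape noise", where firing becomes probabilistic in the distance to threshold; because the firing probability is smooth, noise also becomes usable for gradient-based learning. The sketch below is only a plausible illustration of that general idea, not necessarily the NSNN/NDL formulation.

```python
import numpy as np

def noisy_lif(currents, decay=0.9, v_th=1.0, temp=0.2, seed=0):
    """LIF with escape noise: instead of a hard threshold, the neuron fires
    with probability sigmoid((v - v_th) / temp). The probability is smooth
    in v, so gradients can flow through it during training."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for i in currents:
        v = decay * v + i
        p = 1.0 / (1.0 + np.exp(-(v - v_th) / temp))  # escape probability
        s = float(rng.random() < p)
        v *= (1.0 - s)  # reset the membrane on a spike
        spikes.append(s)
    return spikes

print(noisy_lif([0.5] * 20))
```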
arXiv Detail & Related papers (2023-05-25T13:21:26Z) - Multi-scale Evolutionary Neural Architecture Search for Deep Spiking Neural Networks [7.271032282434803]
We propose a Multi-Scale Evolutionary Neural Architecture Search (MSE-NAS) method for Spiking Neural Networks (SNNs).
MSE-NAS evolves individual neuron operations, the self-organized integration of multiple circuit motifs, and global connectivity across motifs through a brain-inspired indirect evaluation function based on Representational Dissimilarity Matrices (RDMs).
The proposed algorithm achieves state-of-the-art (SOTA) performance with shorter simulation steps on static datasets and neuromorphic datasets.
arXiv Detail & Related papers (2023-04-21T05:36:37Z) - Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
Study of the NTK has been devoted to typical neural network architectures, but it is incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK.
arXiv Detail & Related papers (2022-09-16T06:36:06Z) - SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network [12.237928453571636]
Spiking Neural Networks (SNNs) have piqued researchers' interest because of their capacity to process temporal information and low power consumption.
Current state-of-the-art methods are limited in biological plausibility and performance because their neurons are generally built on the simple Leaky Integrate-and-Fire (LIF) model.
Due to the high level of dynamic complexity, modern neuron models have seldom been implemented in SNN practice.
arXiv Detail & Related papers (2022-03-30T07:50:44Z) - Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z) - POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in a 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)