A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks
- URL: http://arxiv.org/abs/2305.16594v2
- Date: Thu, 4 Jan 2024 02:23:07 GMT
- Title: A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks
- Authors: Xinyi Chen, Qu Yang, Jibin Wu, Haizhou Li, and Kay Chen Tan
- Abstract summary: Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
- Score: 53.31941519245432
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, brain-inspired spiking neural networks (SNNs) have demonstrated
promising capabilities in solving pattern recognition tasks. However, these
SNNs are grounded on homogeneous neurons that utilize a uniform neural coding
for information representation. Given that each neural coding scheme possesses
its own merits and drawbacks, these SNNs encounter challenges in achieving
optimal performance such as accuracy, response time, efficiency, and
robustness, all of which are crucial for practical applications. In this study,
we argue that SNN architectures should be holistically designed to incorporate
heterogeneous coding schemes. As an initial exploration in this direction, we
propose a hybrid neural coding and learning framework, which encompasses a
neural coding zoo with diverse neural coding schemes discovered in
neuroscience. Additionally, it incorporates a flexible neural coding assignment
strategy to accommodate task-specific requirements, along with novel layer-wise
learning methods to effectively implement hybrid coding SNNs. We demonstrate
the superiority of the proposed framework on image classification and sound
localization tasks. Specifically, the proposed hybrid coding SNNs achieve
comparable accuracy to state-of-the-art SNNs, while exhibiting significantly
reduced inference latency and energy consumption, as well as high noise
robustness. This study yields valuable insights into hybrid neural coding
designs, paving the way for developing high-performance neuromorphic systems.
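The abstract contrasts neural coding schemes with different merits and drawbacks. As an illustrative sketch only (not the authors' implementation), the two most common schemes in the framework's "coding zoo" family can be contrasted as follows: rate coding spreads information over many spikes (robust but slow), while latency coding uses the timing of a single spike (fast and sparse but noise-sensitive). All function names and parameters here are assumptions for illustration.

```python
import random

random.seed(0)

def rate_encode(x, n_steps=20):
    """Rate coding: spike probability per step is proportional to intensity x in [0, 1]."""
    return [1 if random.random() < x else 0 for _ in range(n_steps)]

def latency_encode(x, n_steps=20):
    """Latency (time-to-first-spike) coding: stronger inputs spike earlier."""
    train = [0] * n_steps
    if x > 0:
        t = round((1.0 - x) * (n_steps - 1))  # x = 1 -> earliest step, x -> 0 -> latest
        train[t] = 1
    return train

x = 0.8
rate = rate_encode(x)        # many spikes: robust readout, but needs many time steps
latency = latency_encode(x)  # one early spike: low latency and energy, but fragile to jitter
print("rate spike count:", sum(rate))
print("latency first spike at t =", latency.index(1))
```

A hybrid-coding design in the paper's spirit would assign schemes like these to different layers or subpopulations according to task requirements (e.g., latency coding near the input for fast response, rate coding where robustness matters).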
Related papers
- Stepwise Weighted Spike Coding for Deep Spiking Neural Networks [7.524721345903027]
Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons.
We propose a novel Stepwise Weighted Spike (SWS) coding scheme to enhance the encoding of information in spikes.
This approach compresses the spikes by weighting the significance of the spike in each step of neural computation, achieving high performance and low energy consumption.
arXiv Detail & Related papers (2024-08-30T12:39:25Z)
- Stochastic Spiking Neural Networks with First-to-Spike Coding [7.955633422160267]

Spiking Neural Networks (SNNs) are known for their bio-plausibility and energy efficiency.
In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures.
We investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, and energy consumption across multiple datasets.
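First-to-spike coding, as named in the title above, decides a classification as soon as any output neuron fires, which is what gives it low inference latency. A minimal readout sketch (my own illustrative code, not the paper's stochastic formulation):

```python
def first_to_spike_readout(output_trains):
    """First-to-spike decoding: the predicted class is the output neuron
    that fires earliest; inference can halt at that time step."""
    best_class, best_time = None, None
    for cls, train in enumerate(output_trains):
        if 1 in train:
            t = train.index(1)
            if best_time is None or t < best_time:
                best_class, best_time = cls, t
    return best_class, best_time

# Three hypothetical output neurons observed over 6 time steps.
trains = [
    [0, 0, 0, 1, 0, 0],  # class 0 first fires at t = 3
    [0, 1, 0, 0, 1, 0],  # class 1 first fires at t = 1
    [0, 0, 0, 0, 0, 0],  # class 2 never fires
]
print(first_to_spike_readout(trains))  # earliest spike wins: (1, 1)
```

Because the decision is available at the time of the first output spike, later time steps need not be simulated, which is the source of the latency and energy savings these papers report.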
arXiv Detail & Related papers (2024-04-26T22:52:23Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- Exploiting Noise as a Resource for Computation and Learning in Spiking Neural Networks [32.0086664373154]
This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
arXiv Detail & Related papers (2023-05-25T13:21:26Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z)
- Spiking Neural Networks for Visual Place Recognition via Weighted Neuronal Assignments [24.754429120321365]
Spiking neural networks (SNNs) offer compelling potential advantages, including energy efficiency and low latency.
One promising application area for high-performance SNNs is template matching and image recognition.
This research introduces the first high performance SNN for the Visual Place Recognition (VPR) task.
arXiv Detail & Related papers (2021-09-14T05:40:40Z)
- Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks [1.8515971640245998]
Spiking neural networks (SNNs) have been investigated as more biologically plausible and potentially more powerful models of neural computation.
We show how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs.
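Surrogate gradients, mentioned in the abstract above, work around the non-differentiable spike: the forward pass keeps the hard threshold, while the backward pass substitutes a smooth approximation of its derivative. The sketch below shows the idea only, using a SuperSpike-style fast-sigmoid surrogate; it is not the paper's specific adaptive-neuron formulation, and the names and parameter values are illustrative assumptions.

```python
def spike(v, threshold=1.0):
    """Forward pass: hard threshold (Heaviside step) on membrane potential v.
    Its true derivative is zero almost everywhere, so it cannot backpropagate."""
    return 1.0 if v >= threshold else 0.0

def surrogate_grad(v, threshold=1.0, beta=10.0):
    """Backward pass: smooth stand-in for the Heaviside derivative
    (SuperSpike-style fast sigmoid), peaked at the firing threshold."""
    return 1.0 / (1.0 + beta * abs(v - threshold)) ** 2

# The surrogate is largest when v sits exactly at threshold, so gradients
# flow mainly through neurons that are close to firing.
print(spike(1.2), surrogate_grad(1.0), surrogate_grad(0.0))
```

In practice this is implemented as a custom autograd function in frameworks such as PyTorch, so that `spike` is used in the forward computation and `surrogate_grad` in the backward pass.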
arXiv Detail & Related papers (2021-03-12T10:27:29Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.