Adaptive Axonal Delays in feedforward spiking neural networks for accurate spoken word recognition
- URL: http://arxiv.org/abs/2302.08607v1
- Date: Thu, 16 Feb 2023 22:19:04 GMT
- Title: Adaptive Axonal Delays in feedforward spiking neural networks for accurate spoken word recognition
- Authors: Pengfei Sun, Ehsan Eqlimi, Yansong Chua, Paul Devos, Dick Botteldooren
- Abstract summary: Spiking neural networks (SNN) are a promising research avenue for building accurate and efficient automatic speech recognition systems.
Recent advances in audio-to-spike encoding and training algorithms enable SNN to be applied in practical tasks.
Our work illustrates the potential of training axonal delays for tasks with complex temporal structures.
- Score: 4.018601183900039
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNN) are a promising research avenue for building
accurate and efficient automatic speech recognition systems. Recent advances in
audio-to-spike encoding and training algorithms enable SNN to be applied in
practical tasks. Biologically-inspired SNNs communicate using sparse,
asynchronous events, so spike timing is critical to SNN performance. In
this respect, most works focus on training synaptic weights, and few have
considered delays in event transmission, namely axonal delay. In this work, we
consider a learnable axonal delay capped at a maximum value, which can be
adapted according to the axonal delay distribution in each network layer. We
show that our proposed method achieves the best classification results reported
on the SHD dataset (92.45%) and NTIDIGITS dataset (95.09%). Our work
illustrates the potential of training axonal delays for tasks with complex
temporal structures.
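The core mechanism described in the abstract, a per-neuron learnable delay capped at a maximum value, amounts to shifting each neuron's output spike train in time before it reaches the next layer. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation; the function name and the time-by-neuron array layout are assumptions made for clarity.

```python
import numpy as np

def apply_axonal_delays(spikes, delays, d_max):
    """Shift each neuron's spike train by its (capped) axonal delay.

    spikes : (T, N) binary array, one column per neuron.
    delays : (N,) integer delays, clipped to [0, d_max].
    Returns a (T + d_max, N) array so delayed spikes are not truncated.
    """
    T, N = spikes.shape
    delays = np.clip(delays, 0, d_max)  # cap each delay at the maximum value
    out = np.zeros((T + d_max, N), dtype=spikes.dtype)
    for n in range(N):
        # Neuron n's entire spike train arrives delays[n] steps later.
        out[delays[n]:delays[n] + T, n] = spikes[:, n]
    return out

# Two neurons over three time steps: neuron 0 fires at t=0, neuron 1 at t=1.
spikes = np.array([[1, 0],
                   [0, 1],
                   [0, 0]])
delayed = apply_axonal_delays(spikes, np.array([0, 2]), d_max=2)
# Neuron 1's spike now arrives at t = 1 + 2 = 3.
```

During training, the paper adapts the cap according to the delay distribution observed in each layer; a scheduler that loosens the cap when many learned delays saturate at the current maximum would be one plausible realization of that adaptation.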
Related papers
- TSkips: Efficiency Through Explicit Temporal Delay Connections in Spiking Neural Networks [8.13696328386179]
We propose TSkips, augmenting Spiking Neural Networks with forward and backward skip connections that incorporate explicit temporal delays.
These connections capture long-term temporal dependencies and facilitate better spike flow over long sequences.
We demonstrate the effectiveness of our approach on four event-based datasets.
arXiv Detail & Related papers (2024-11-22T18:58:18Z)
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
- Low Latency of object detection for spiking neural network [3.404826786562694]
Spiking Neural Networks are well-suited for edge AI applications due to their binary spike nature.
In this paper, we focus on generating highly accurate and low-latency SNNs specifically for object detection.
arXiv Detail & Related papers (2023-09-27T10:26:19Z)
- Co-learning synaptic delays, weights and adaptation in spiking neural networks [0.0]
Spiking neural networks (SNN) distinguish themselves from artificial neural networks (ANN) because of their inherent temporal processing and spike-based computations.
We show that data processing with spiking neurons can be enhanced by co-learning the connection weights with two other biologically inspired neuronal features.
arXiv Detail & Related papers (2023-09-12T09:13:26Z)
- Learning Delays in Spiking Neural Networks using Dilated Convolutions with Learnable Spacings [1.534667887016089]
Spiking Neural Networks (SNNs) are a promising research direction for building power-efficient information processing systems.
In SNNs, delays refer to the time needed for one spike to travel from one neuron to another.
It has been shown theoretically that plastic delays greatly increase the expressivity in SNNs.
We propose a new discrete-time algorithm that addresses this issue in deep feedforward SNNs using backpropagation.
arXiv Detail & Related papers (2023-06-30T14:01:53Z)
- Knowing When to Stop: Delay-Adaptive Spiking Neural Network Classifiers with Reliability Guarantees [36.14499894307206]
Spiking neural networks (SNNs) process time-series data via internal event-driven neural dynamics.
We introduce a novel delay-adaptive SNN-based inference methodology that provides guaranteed reliability for the decisions produced at input-dependent stopping times.
arXiv Detail & Related papers (2023-05-18T22:11:04Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Deep Time Delay Neural Network for Speech Enhancement with Full Data Learning [60.20150317299749]
This paper proposes a deep time delay neural network (TDNN) for speech enhancement with full data learning.
To make full use of the training data, we propose a full data learning method for speech enhancement.
arXiv Detail & Related papers (2020-11-11T06:32:37Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.