Efficient spike encoding algorithms for neuromorphic speech recognition
- URL: http://arxiv.org/abs/2207.07073v1
- Date: Thu, 14 Jul 2022 17:22:07 GMT
- Title: Efficient spike encoding algorithms for neuromorphic speech recognition
- Authors: Sidi Yaya Arnaud Yarga, Jean Rouat, Sean U. N. Wood
- Abstract summary: Spiking Neural Networks (SNN) are very effective for neuromorphic processor implementations.
Audio and other sensor-derived data are typically encoded as real-valued signals that are not well-suited to SNN.
In this paper, we study four spike encoding methods in the context of a speaker independent digit classification system.
- Score: 5.182266520875928
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNN) are known to be very effective for neuromorphic
processor implementations, achieving orders of magnitude improvements in energy
efficiency and computational latency over traditional deep learning approaches.
Comparable algorithmic performance was recently made possible as well with the
adaptation of supervised training algorithms to the context of SNN. However,
information including audio, video, and other sensor-derived data are typically
encoded as real-valued signals that are not well-suited to SNN, preventing the
network from leveraging spike timing information. Efficient encoding from
real-valued signals to spikes is therefore critical and significantly impacts
the performance of the overall system. To efficiently encode signals into
spikes, both the preservation of information relevant to the task at hand as
well as the density of the encoded spikes must be considered. In this paper, we
study four spike encoding methods in the context of a speaker independent digit
classification system: Send on Delta, Time to First Spike, Leaky Integrate and
Fire Neuron, and Ben's Spiker Algorithm. We first show that all encoding methods
yield higher classification accuracy using significantly fewer spikes when
encoding a bio-inspired cochleagram as opposed to a traditional short-time
Fourier transform. We then show that two Send On Delta variants result in
classification results comparable with a state-of-the-art deep convolutional
neural network baseline, while simultaneously reducing the encoded bit rate.
Finally, we show that several encoding methods result in improved performance
over the conventional deep learning baseline in certain cases, further
demonstrating the power of spike encoding algorithms in the encoding of
real-valued signals and that neuromorphic implementation has the potential to
outperform state-of-the-art techniques.
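As a rough illustration of one of the four encoding methods named above, a minimal Send-on-Delta encoder can be sketched as follows. This is a generic sketch, not the paper's exact formulation; the threshold value and the polarity convention are illustrative assumptions.

```python
import numpy as np

def send_on_delta(signal, threshold):
    """Send-on-Delta encoder: emit a spike whenever the signal deviates
    from the last transmitted value by at least `threshold`.
    Returns (times, polarities): spike indices and +1/-1 signs."""
    times, polarities = [], []
    ref = signal[0]  # last transmitted (reference) value
    for t, x in enumerate(signal[1:], start=1):
        delta = x - ref
        if abs(delta) >= threshold:
            times.append(t)
            polarities.append(1 if delta > 0 else -1)
            ref = x  # update reference to the current value
    return times, polarities

# Example: a slow ramp produces few spikes, since the encoder only
# fires when the signal has moved by a full threshold step.
sig = np.linspace(0.0, 1.0, 11)  # ramp from 0 to 1 in steps of 0.1
times, pols = send_on_delta(sig, threshold=0.25)
print(times, pols)  # → [3, 6, 9] [1, 1, 1]
```

The spike count falls directly out of the threshold choice, which is the density/fidelity trade-off the abstract highlights: a larger threshold lowers the encoded bit rate at the cost of reconstruction accuracy.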
Related papers
- Adaptive Spiking Neural Networks with Hybrid Coding [0.0]
Spiking Neural Networks (SNNs) are more energy-efficient and effective neural networks compared to Artificial Neural Networks (ANNs).
Traditional SNNs utilize the same neurons when processing input data across different time steps, limiting their ability to integrate and utilize temporal information effectively.
This paper introduces a hybrid encoding approach that not only reduces the required time steps for training but also continues to improve the overall network performance.
arXiv Detail & Related papers (2024-08-22T13:58:35Z) - Surrogate Gradient Spiking Neural Networks as Encoders for Large Vocabulary Continuous Speech Recognition [91.39701446828144]
We show that spiking neural networks can be trained like standard recurrent neural networks using the surrogate gradient method.
They have shown promising results on speech command recognition tasks.
In contrast to their recurrent non-spiking counterparts, they show robustness to exploding gradient problems without the need to use gates.
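As a hedged sketch of the surrogate-gradient idea mentioned above (not this paper's exact formulation): the spiking threshold is a Heaviside step, whose derivative is zero almost everywhere, so backpropagation substitutes a smooth proxy such as the derivative of a steep sigmoid. The `beta` steepness parameter here is an illustrative assumption.

```python
import numpy as np

def spike(v, threshold=1.0):
    """Forward pass: hard threshold (Heaviside step), non-differentiable."""
    return (v >= threshold).astype(float)

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    """Backward pass: replace the Heaviside's zero/undefined derivative
    with the derivative of a steep sigmoid sigma(beta * (v - threshold))."""
    s = 1.0 / (1.0 + np.exp(-beta * (v - threshold)))
    return beta * s * (1.0 - s)

v = np.array([0.2, 0.9, 1.0, 1.5])
print(spike(v))                 # → [0. 0. 1. 1.]
print(spike_surrogate_grad(v))  # smooth gradient, peaked at the threshold
```

Because the surrogate is bounded and concentrated near the threshold, gradients stay finite through time, which is consistent with the robustness to exploding gradients noted in the summary.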
arXiv Detail & Related papers (2022-12-01T12:36:26Z) - NAF: Neural Attenuation Fields for Sparse-View CBCT Reconstruction [79.13750275141139]
This paper proposes a novel and fast self-supervised solution for sparse-view CBCT reconstruction.
The desired attenuation coefficients are represented as a continuous function of 3D spatial coordinates, parameterized by a fully-connected deep neural network.
A learning-based encoder entailing hash coding is adopted to help the network capture high-frequency details.
arXiv Detail & Related papers (2022-09-29T04:06:00Z) - Graph Neural Networks for Channel Decoding [71.15576353630667]
We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against state-of-the-art in conventional channel decoding as well as against recent deep learning-based results.
arXiv Detail & Related papers (2022-07-29T15:29:18Z) - Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding [5.725845886457027]
Spiking Neural Networks (SNNs) provide higher computational efficiency in event-driven neuromorphic hardware.
However, SNNs suffer from high inference latency, resulting from inefficient input encoding and training techniques.
This paper presents a training framework for low-latency energy-efficient SNNs.
arXiv Detail & Related papers (2021-07-26T06:16:40Z) - Quantized Neural Networks via {-1, +1} Encoding Decomposition and Acceleration [83.84684675841167]
We propose a novel encoding scheme using {-1, +1} to decompose quantized neural networks (QNNs) into multi-branch binary networks.
We validate the effectiveness of our method on large-scale image classification, object detection, and semantic segmentation tasks.
arXiv Detail & Related papers (2021-06-18T03:11:15Z) - Learning to Time-Decode in Spiking Neural Networks Through the Information Bottleneck [37.376989855065545]
One of the key challenges in training Spiking Neural Networks (SNNs) is that target outputs typically come in the form of natural signals.
A conventional workaround is to handcraft target spiking signals, which in turn implicitly fixes the mechanisms used to decode spikes into natural signals.
This work introduces a hybrid variational autoencoder architecture, consisting of an encoding SNN and a decoding Artificial Neural Network.
arXiv Detail & Related papers (2021-06-02T14:14:47Z) - Learning Structures for Deep Neural Networks [99.8331363309895]
We propose to adopt the efficient coding principle, rooted in information theory and developed in computational neuroscience.
We show that sparse coding can effectively maximize the entropy of the output signals.
Our experiments on a public image classification dataset demonstrate that using the structure learned from scratch by our proposed algorithm, one can achieve a classification accuracy comparable to the best expert-designed structure.
arXiv Detail & Related papers (2021-05-27T12:27:24Z) - Supervised Learning with First-to-Spike Decoding in Multilayer Spiking Neural Networks [0.0]
We propose a new supervised learning method that can train multilayer spiking neural networks to solve classification problems.
The proposed learning rule supports multiple spikes fired by hidden neurons, and yet is stable by relying on first-spike responses generated by a deterministic output layer.
We also explore several distinct spike-based encoding strategies in order to form compact representations of input data.
arXiv Detail & Related papers (2020-08-16T15:34:48Z) - Infomax Neural Joint Source-Channel Coding via Adversarial Bit Flip [41.28049430114734]
We propose a novel regularization method called Infomax Adversarial-Bit-Flip (IABF) to improve the stability and robustness of the neural joint source-channel coding scheme.
Our IABF can achieve state-of-the-art performances on both compression and error correction benchmarks and outperform the baselines by a significant margin.
arXiv Detail & Related papers (2020-04-03T10:00:02Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.