A Comparison of Temporal Encoders for Neuromorphic Keyword Spotting with
Few Neurons
- URL: http://arxiv.org/abs/2301.09962v1
- Date: Tue, 24 Jan 2023 12:50:54 GMT
- Title: A Comparison of Temporal Encoders for Neuromorphic Keyword Spotting with
Few Neurons
- Authors: Mattias Nilsson, Ton Juny Pina, Lyes Khacef, Foteini Liwicki,
Elisabetta Chicca, and Fredrik Sandin
- Abstract summary: Two candidate neurocomputational elements for temporal encoding and feature extraction in SNNs are investigated.
Resource-efficient keyword spotting applications may benefit from the use of these encoders, but further work on methods for learning the time constants and weights is required.
- Score: 0.11726720776908518
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the expansion of AI-powered virtual assistants, there is a need for
low-power keyword spotting systems providing a "wake-up" mechanism for
subsequent computationally expensive speech recognition. One promising approach
is the use of neuromorphic sensors and spiking neural networks (SNNs)
implemented in neuromorphic processors for sparse event-driven sensing.
However, this requires resource-efficient SNN mechanisms for temporal encoding,
which need to consider that these systems process information in a streaming
manner, with physical time being an intrinsic property of their operation. In
this work, two candidate neurocomputational elements for temporal encoding and
feature extraction in SNNs described in recent literature - the spiking
time-difference encoder (TDE) and disynaptic excitatory-inhibitory (E-I)
elements - are comparatively investigated in a keyword-spotting task on
formants computed from spoken digits in the TIDIGITS dataset. While both
encoders improve performance over direct classification of the formant features
in the training data, enabling a complete binary classification with a logistic
regression model, they show no clear improvements on the test set.
Resource-efficient keyword spotting applications may benefit from the use of
these encoders, but further work on methods for learning the time constants and
weights is required to investigate their full potential.
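The spiking time-difference encoder (TDE) investigated above converts the time difference between two input spikes into an output spike response. A minimal sketch of that idea, with purely illustrative parameter values (the paper's actual time constants and weights are not reproduced here):

```python
import math

def tde_response(dt_s, tau_fac=0.02, threshold=1.0, gain=2.0):
    """Toy sketch of a spiking time-difference encoder (TDE).

    A facilitatory spike at t=0 sets a trace that decays with time
    constant tau_fac; a trigger spike arriving dt_s seconds later
    injects a current scaled by the remaining trace. A crude threshold
    stands in for full LIF integration. All values are illustrative.
    """
    trace = math.exp(-dt_s / tau_fac)   # facilitatory trace at trigger time
    current = gain * trace              # current injected by the trigger spike
    fires = current >= threshold        # shorter dt -> larger current -> spike
    return current, fires
```

Short time differences leave a large residual trace and hence a strong response, so the element's output encodes temporal proximity between the two input channels.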
Related papers
- Stochastic Spiking Neural Networks with First-to-Spike Coding [7.955633422160267]
Spiking Neural Networks (SNNs) are known for their bio-plausibility and energy efficiency.
In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures.
We investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, energy consumption, and datasets.
arXiv Detail & Related papers (2024-04-26T22:52:23Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]

We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Surrogate Gradient Spiking Neural Networks as Encoders for Large Vocabulary Continuous Speech Recognition [91.39701446828144]
We show that spiking neural networks can be trained like standard recurrent neural networks using the surrogate gradient method.
They have shown promising results on speech command recognition tasks.
In contrast to their recurrent non-spiking counterparts, they show robustness to exploding gradient problems without the need to use gates.
arXiv Detail & Related papers (2022-12-01T12:36:26Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Braille Letter Reading: A Benchmark for Spatio-Temporal Pattern Recognition on Neuromorphic Hardware [50.380319968947035]
Recent deep learning approaches have reached high accuracy on such tasks, but their implementation on conventional embedded solutions is still computationally and energy expensive.
We propose a new benchmark for computing tactile pattern recognition at the edge through letters reading.
We trained and compared feed-forward and recurrent spiking neural networks (SNNs) offline using back-propagation through time with surrogate gradients, then we deployed them on the Intel Loihi neuromorphic chip for efficient inference.
Our results show that the LSTM outperforms the recurrent SNN in accuracy by 14%; however, the recurrent SNN on Loihi is 237 times more energy efficient.
arXiv Detail & Related papers (2022-05-30T14:30:45Z)
- Brain inspired neuronal silencing mechanism to enable reliable sequence identification [0.0]
We present an experimental mechanism for high-precision feedforward sequence identification networks (ID-nets) without feedback loops.
This mechanism temporarily silences neurons following their recent spiking activity.
The presented mechanism opens new horizons for advanced ANN algorithms.
arXiv Detail & Related papers (2022-03-24T12:15:02Z)
- A Deep 2-Dimensional Dynamical Spiking Neuronal Network for Temporal Encoding trained with STDP [10.982390333064536]
We show that a large, deep layered SNN with dynamical, chaotic activity mimicking the mammalian cortex is capable of encoding information from temporal data.
We argue that the randomness inherent in the network weights allow the neurons to form groups that encode the temporal data being inputted after self-organizing with STDP.
We analyze the network in terms of network entropy as a metric of information transfer.
arXiv Detail & Related papers (2020-09-01T17:12:18Z)
- Multivariate Time Series Classification Using Spiking Neural Networks [7.273181759304122]
Spiking neural networks have drawn attention because they enable low power consumption.
We present an encoding scheme to convert time series into sparse spatial temporal spike patterns.
A training algorithm to classify spatial temporal patterns is also proposed.
arXiv Detail & Related papers (2020-07-07T15:24:01Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Exploiting Neuron and Synapse Filter Dynamics in Spatial Temporal Learning of Deep Spiking Neural Network [7.503685643036081]
A bio-plausible SNN model with spatial-temporal property is a complex dynamic system.
We formulate SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity.
We propose a training algorithm that is capable to learn spatial-temporal patterns by searching for the optimal synapse filter kernels and weights.
arXiv Detail & Related papers (2020-02-19T01:27:39Z)
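The last entry above formulates an SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity. A toy single-step version of that view, with a synapse and a membrane each modeled as a first-order IIR filter (decay factors and threshold are illustrative, not taken from the paper):

```python
def lif_iir_step(v, i_syn, spike_in, w=0.5, alpha=0.9, beta=0.8, v_th=1.0):
    """One step of a leaky integrate-and-fire neuron written as two
    cascaded first-order IIR filters. alpha and beta play the role of
    the discrete-time decays exp(-dt/tau); all values are illustrative.
    """
    i_syn = alpha * i_syn + w * spike_in   # synaptic current: first-order IIR
    v = beta * v + i_syn                   # membrane potential: first-order IIR
    spike_out = 1.0 if v >= v_th else 0.0
    if spike_out:
        v = 0.0                            # reset membrane after output spike
    return v, i_syn, spike_out
```

Unrolling this step over an input spike train makes the spatial-temporal dynamics explicit: learning the decay factors corresponds to searching for the optimal synapse filter kernels alongside the weights.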
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.