WaveSense: Efficient Temporal Convolutions with Spiking Neural Networks
for Keyword Spotting
- URL: http://arxiv.org/abs/2111.01456v1
- Date: Tue, 2 Nov 2021 09:38:22 GMT
- Authors: Philipp Weidel, Sadique Sheik
- Abstract summary: We propose spiking neural dynamics as a natural alternative to dilated temporal convolutions.
We extend this idea to WaveSense, a spiking neural network inspired by the WaveNet architecture.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Ultra-low power local signal processing is a crucial aspect for edge
applications on always-on devices. Neuromorphic processors emulating spiking
neural networks show great computational power while fulfilling the limited
power budget as needed in this domain. In this work we propose spiking neural
dynamics as a natural alternative to dilated temporal convolutions. We extend
this idea to WaveSense, a spiking neural network inspired by the WaveNet
architecture. WaveSense uses simple neural dynamics, fixed time-constants and a
simple feed-forward architecture and hence is particularly well suited for a
neuromorphic implementation. We test the capabilities of this model on several
datasets for keyword spotting. The results show that the proposed network
surpasses the state of the art among spiking neural networks and approaches the
state-of-the-art performance of artificial neural networks such as CNNs and
LSTMs.
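The "simple neural dynamics" with fixed time constants that the abstract contrasts with dilated temporal convolutions can be illustrated with a leaky integrator. This is a hedged sketch, not code from the paper; the function name and parameter values are illustrative.

```python
import numpy as np

def leaky_integrator(spikes, tau=20.0, dt=1.0):
    """Leaky integration with a fixed time constant tau.

    At each step the state decays by exp(-dt/tau) before the new input
    is added, so the neuron carries an exponentially fading memory of
    past spikes -- covering a long temporal context without the stacked
    dilated convolutions of a WaveNet.
    """
    decay = np.exp(-dt / tau)  # per-step decay factor
    v, trace = 0.0, []
    for s in spikes:
        v = decay * v + s
        trace.append(v)
    return np.array(trace)
```

With tau = 20 steps, an isolated spike still contributes about 37% (e^-1) of its initial weight 20 steps later, giving an effective receptive field comparable to several dilated-convolution layers.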
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Expressivity of Spiking Neural Networks [15.181458163440634]
We study the capabilities of spiking neural networks where information is encoded in the firing time of neurons.
In contrast to ReLU networks, we prove that spiking neural networks can realize both continuous and discontinuous functions.
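The discontinuity claim can already be seen for a single neuron. The sketch below is a hypothetical illustration (not the paper's construction, and the parameter values are made up): the number of spikes emitted within a finite window jumps from 0 to 1 as the input grows, a discontinuous input-output map that a ReLU network, being continuous, cannot realize.

```python
import math

def first_spike_time(i_in, tau=10.0, v_th=1.0):
    """First spike time of a leaky integrator under constant input i_in.

    The membrane follows v(t) = i_in * (1 - exp(-t/tau)) and the neuron
    fires when v reaches v_th; for i_in <= v_th it never fires.
    """
    if i_in <= v_th:
        return math.inf
    return -tau * math.log(1.0 - v_th / i_in)

def spike_count(i_in, window=50.0, tau=10.0, v_th=1.0):
    """Spikes emitted within a finite window: a step function of i_in,
    i.e. a discontinuous function realized by spiking dynamics."""
    return 1 if first_spike_time(i_in, tau, v_th) <= window else 0
```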
arXiv Detail & Related papers (2023-08-16T08:45:53Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- SpikiLi: A Spiking Simulation of LiDAR based Real-time Object Detection for Autonomous Driving [0.0]
Spiking Neural Networks are a new neural network design approach that promises tremendous improvements in power efficiency, computation efficiency, and processing latency.
We first illustrate the applicability of spiking neural networks to a complex deep learning task, namely LiDAR-based 3D object detection for automated driving.
arXiv Detail & Related papers (2022-06-06T20:05:17Z)
- Stochastic resonance neurons in artificial neural networks [0.0]
We propose a new type of neural network using stochastic resonances as an inherent part of the architecture.
We show that such a neural network is more robust against the impact of noise.
arXiv Detail & Related papers (2022-05-06T18:42:36Z)
- The Spectral Bias of Polynomial Neural Networks [63.27903166253743]
Polynomial neural networks (PNNs) have been shown to be particularly effective at image generation and face recognition, where high-frequency information is critical.
Previous studies have revealed that neural networks demonstrate a spectral bias towards low-frequency functions, which yields faster learning of low-frequency components during training.
Inspired by such studies, we conduct a spectral analysis of the Neural Tangent Kernel (NTK) of PNNs.
We find that the $\Pi$-Net family, i.e., a recently proposed parametrization of PNNs, speeds up the learning of the higher frequencies.
arXiv Detail & Related papers (2022-02-27T23:12:43Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Deep Spiking Convolutional Neural Network for Single Object Localization Based On Deep Continuous Local Learning [0.0]
We propose a deep convolutional spiking neural network for the localization of a single object in a grayscale image.
Results reported on the Oxford-IIIT Pet dataset validate the exploitation of spiking neural networks with a supervised learning approach.
arXiv Detail & Related papers (2021-05-12T12:02:05Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- A$^3$: Accelerating Attention Mechanisms in Neural Networks with Approximation [3.5217810503607896]
We design and architect A$^3$, which accelerates attention mechanisms in neural networks with algorithmic approximation and hardware specialization.
Our proposed accelerator achieves multiple orders of magnitude improvement in energy efficiency (performance/watt) as well as substantial speedup over the state-of-the-art conventional hardware.
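One generic flavour of such algorithmic approximation is evaluating softmax attention over only the top-k scoring keys. The sketch below is illustrative only: A$^3$'s actual candidate-selection mechanism is hardware-specific, and the function name and shapes here are assumptions.

```python
import numpy as np

def approx_attention(q, K, V, k=4):
    """Approximate single-query attention that keeps only the k keys
    with the largest dot-product scores, skipping the rest entirely.
    q: (d,), K: (n, d), V: (n, dv); returns a (dv,) output vector."""
    scores = K @ q                           # dot-product scores, (n,)
    top = np.argpartition(scores, -k)[-k:]   # indices of the k largest scores
    w = np.exp(scores[top] - scores[top].max())  # softmax over survivors only
    w /= w.sum()
    return w @ V[top]
```

With k equal to the number of keys this reduces to exact softmax attention; shrinking k trades a small accuracy loss for proportionally fewer score and weighted-sum operations, which is the kind of work reduction an approximation-based accelerator exploits.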
arXiv Detail & Related papers (2020-02-22T02:09:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.