Multivariate Time Series Classification Using Spiking Neural Networks
- URL: http://arxiv.org/abs/2007.03547v1
- Date: Tue, 7 Jul 2020 15:24:01 GMT
- Title: Multivariate Time Series Classification Using Spiking Neural Networks
- Authors: Haowen Fang, Amar Shrestha, Qinru Qiu
- Abstract summary: Spiking neural networks have drawn attention as they enable low power consumption.
We present an encoding scheme to convert time series into sparse spatio-temporal spike patterns.
A training algorithm to classify spatio-temporal patterns is also proposed.
- Score: 7.273181759304122
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There is an increasing demand to process streams of temporal data in
energy-limited scenarios such as embedded devices, driven by the advancement
and expansion of Internet of Things (IoT) and Cyber-Physical Systems (CPS).
Spiking neural networks have drawn attention as they enable low power consumption
by encoding and processing information as sparse spike events, which can be
exploited for event-driven computation. Recent works also show the capability of
SNNs to process spatio-temporal information. Such advantages can be exploited by
power-limited devices to process real-time sensor data. However, most existing
SNN training algorithms focus on vision tasks and do not address temporal credit
assignment. Furthermore, the widely adopted rate encoding ignores temporal
information, hence it is not suitable for representing time series. In this
work, we present an encoding scheme to convert time series into sparse
spatio-temporal spike patterns. A training algorithm to classify spatio-temporal
patterns is also proposed. The proposed approach is evaluated on multiple time
series datasets from the UCR repository and achieves performance comparable to
deep neural networks.
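The abstract does not detail the encoding scheme beyond converting each time series into a sparse spatio-temporal spike pattern. As a hedged illustration, below is a minimal sketch of one common family of such encoders (delta-modulation / temporal-contrast encoding, with an ON and an OFF spike channel per input variable); the function names and the threshold value are illustrative and are not taken from the paper.

```python
import numpy as np

def delta_spike_encode(x, threshold=0.1):
    """Delta-modulation (temporal-contrast) encoding of a 1-D signal.

    Emits an ON spike when the signal rises by `threshold` since the last
    spike and an OFF spike when it falls by `threshold`. Returns a (T, 2)
    binary array: column 0 = ON channel, column 1 = OFF channel.
    """
    T = len(x)
    spikes = np.zeros((T, 2), dtype=np.uint8)
    baseline = x[0]
    for t in range(1, T):
        if x[t] - baseline >= threshold:      # rose enough -> ON spike
            spikes[t, 0] = 1
            baseline = x[t]
        elif baseline - x[t] >= threshold:    # fell enough -> OFF spike
            spikes[t, 1] = 1
            baseline = x[t]
    return spikes

def encode_multivariate(series, threshold=0.1):
    """Encode a (T, D) multivariate series into a (T, 2*D) sparse spike pattern,
    two spike channels (ON/OFF) per input variable."""
    return np.concatenate([delta_spike_encode(series[:, d], threshold)
                           for d in range(series.shape[1])], axis=1)

# Tiny usage example on a synthetic 2-variable series.
t = np.linspace(0, 1, 100)
series = np.stack([np.sin(2 * np.pi * 3 * t), np.cos(2 * np.pi * 5 * t)], axis=1)
pattern = encode_multivariate(series, threshold=0.2)
print(pattern.shape, "spikes:", pattern.sum(), "density:", pattern.mean())
```

Under this kind of scheme a D-variate series becomes a (T, 2*D) binary pattern whose sparsity is controlled by the threshold; timing, not rate, carries the information.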
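The training algorithm is likewise only named in the abstract. A common present-day recipe for training an SNN classifier on spatio-temporal spike patterns is back-propagation through time with a surrogate gradient for the non-differentiable spike function; the sketch below shows that generic recipe in PyTorch and is not the authors' specific algorithm. All layer sizes, the leak factor, and the surrogate shape are placeholder assumptions.

```python
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside step with a fast-sigmoid surrogate gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x):                     # x = membrane potential minus threshold
        ctx.save_for_backward(x)
        return (x > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * x.abs()) ** 2   # surrogate derivative

class LIFClassifier(nn.Module):
    """One hidden layer of leaky integrate-and-fire neurons; logits are time-averaged readouts."""
    def __init__(self, n_in, n_hidden, n_classes, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc1 = nn.Linear(n_in, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_classes)
        self.beta, self.threshold = beta, threshold

    def forward(self, spikes):               # spikes: (batch, T, n_in) binary input pattern
        batch, T, _ = spikes.shape
        v = spikes.new_zeros(batch, self.fc1.out_features)       # hidden membrane potentials
        logits = spikes.new_zeros(batch, self.fc2.out_features)
        for t in range(T):                   # unroll over time -> BPTT through the surrogate
            v = self.beta * v + self.fc1(spikes[:, t])
            s = SpikeFn.apply(v - self.threshold)                 # hidden spikes at step t
            v = v - s * self.threshold                            # soft reset after a spike
            logits = logits + self.fc2(s)                         # accumulate readout
        return logits / T

# Toy usage on random sparse spike patterns with 4 classes.
x = (torch.rand(8, 100, 4) < 0.1).float()
y = torch.randint(0, 4, (8,))
model = LIFClassifier(n_in=4, n_hidden=32, n_classes=4)
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()                              # gradients flow via the surrogate derivative
print(float(loss))
```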
Related papers
- Adaptive Spiking Neural Networks with Hybrid Coding [0.0]
The Spiking Neural Network (SNN) is a more energy-efficient and effective alternative to Artificial Neural Networks (ANNs).
Traditional SNNs use the same neurons when processing input data across different time steps, limiting their ability to integrate and utilize temporal information effectively.
This paper introduces a hybrid encoding approach that not only reduces the required time steps for training but also improves the overall network performance.
arXiv Detail & Related papers (2024-08-22T13:58:35Z) - A Comparison of Temporal Encoders for Neuromorphic Keyword Spotting with
Few Neurons [0.11726720776908518]
Two candidate neurocomputational elements for temporal encoding and feature extraction in SNNs are investigated.
Resource-efficient keyword spotting applications may benefit from the use of these encoders, but further work on methods for learning the time constants and weights is required.
arXiv Detail & Related papers (2023-01-24T12:50:54Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - NAF: Neural Attenuation Fields for Sparse-View CBCT Reconstruction [79.13750275141139]
This paper proposes a novel and fast self-supervised solution for sparse-view CBCT reconstruction.
The desired attenuation coefficients are represented as a continuous function of 3D spatial coordinates, parameterized by a fully-connected deep neural network.
A learning-based encoder entailing hash coding is adopted to help the network capture high-frequency details.
arXiv Detail & Related papers (2022-09-29T04:06:00Z) - Scalable Spatiotemporal Graph Neural Networks [14.415967477487692]
Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatio-temporal GNNs, the computational complexity scales quadratically with the length of the sequence times the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
arXiv Detail & Related papers (2022-09-14T09:47:38Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Braille Letter Reading: A Benchmark for Spatio-Temporal Pattern
Recognition on Neuromorphic Hardware [50.380319968947035]
Recent deep learning approaches have reached high accuracy in such tasks, but their implementation on conventional embedded solutions is still computationally and energy expensive.
We propose a new benchmark for computing tactile pattern recognition at the edge through Braille letter reading.
We trained and compared feed-forward and recurrent spiking neural networks (SNNs) offline using back-propagation through time with surrogate gradients, then we deployed them on the Intel Loihi neuromorphic chip for efficient inference.
Our results show that the LSTM outperforms the recurrent SNN in terms of accuracy by 14%. However, the recurrent SNN on Loihi is 237 times more energy efficient.
arXiv Detail & Related papers (2022-05-30T14:30:45Z) - Deep Cellular Recurrent Network for Efficient Analysis of Time-Series
Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z) - A Prospective Study on Sequence-Driven Temporal Sampling and Ego-Motion
Compensation for Action Recognition in the EPIC-Kitchens Dataset [68.8204255655161]
Action recognition is one of the most challenging research fields in computer vision.
Sequences recorded with ego-motion have become particularly relevant.
The proposed method aims to cope with this by estimating the ego-motion (camera motion).
arXiv Detail & Related papers (2020-08-26T14:44:45Z) - SRDCNN: Strongly Regularized Deep Convolution Neural Network
Architecture for Time-series Sensor Signal Classification Tasks [4.950427992960756]
We present SRDCNN, a Strongly Regularized Deep Convolutional Neural Network (DCNN) architecture for time series classification tasks.
The novelty of the proposed approach is that the network weights are regularized by both L1 and L2 norm penalties (a minimal sketch of such a combined penalty appears after this list).
arXiv Detail & Related papers (2020-07-14T08:42:39Z)
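The SRDCNN entry above regularizes the network weights with both L1 and L2 norm penalties (elastic-net style). As a hedged illustration, the sketch below adds such a combined penalty to a classification loss in PyTorch; the coefficients and the toy 1-D CNN are placeholders, not details taken from that paper.

```python
import torch
import torch.nn as nn

def l1_l2_penalty(model, l1=1e-5, l2=1e-4):
    """Combined L1 + L2 (elastic-net style) penalty over all trainable parameters."""
    l1_term = sum(p.abs().sum() for p in model.parameters())
    l2_term = sum((p ** 2).sum() for p in model.parameters())
    return l1 * l1_term + l2 * l2_term

# Toy usage with a small 1-D CNN classifier on (batch, channels, time) input.
net = nn.Sequential(nn.Conv1d(3, 16, kernel_size=5, padding=2),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool1d(1),
                    nn.Flatten(),
                    nn.Linear(16, 4))
x = torch.randn(8, 3, 128)
y = torch.randint(0, 4, (8,))
loss = nn.functional.cross_entropy(net(x), y) + l1_l2_penalty(net)
loss.backward()   # the penalty adds its gradients to every parameter
```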
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.