Anomaly Detection And Classification In Time Series With Kervolutional Neural Networks
- URL: http://arxiv.org/abs/2005.07078v1
- Date: Thu, 14 May 2020 15:45:11 GMT
- Title: Anomaly Detection And Classification In Time Series With Kervolutional Neural Networks
- Authors: Oliver Ammann, Gabriel Michau, Olga Fink
- Abstract summary: In this paper, we explore the potential of kervolutional neural networks applied to time series data.
We demonstrate that using a mixture of convolutional and kervolutional layers improves the model performance.
We propose a residual-based anomaly detection approach using a temporal auto-encoder.
- Score: 1.3535770763481902
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, with the development of deep learning, end-to-end neural network
architectures have been increasingly applied to condition monitoring signals.
They have demonstrated superior performance for fault detection and
classification, in particular using convolutional neural networks. Even more
recently, an extension of the concept of convolution to the concept of
kervolution has been proposed with some promising results in image
classification tasks. In this paper, we explore the potential of kervolutional
neural networks applied to time series data. We demonstrate that using a
mixture of convolutional and kervolutional layers improves the model
performance. The mixed model is first applied to a time series classification
task on a benchmark dataset. Subsequently, the proposed mixed architecture is
used to detect anomalies in time series data recorded by accelerometers on
helicopters. We propose a residual-based anomaly detection approach using a
temporal auto-encoder. We demonstrate that an encoder mixing kervolutional and
convolutional layers is more sensitive to variations in the input data and
detects anomalous time series more reliably.
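To make the idea concrete: kervolution replaces the inner product of a convolutional filter with a non-linear kernel evaluation, so a polynomial or Gaussian kernel is applied patch by patch instead of a plain dot product. The sketch below is a minimal, illustrative NumPy version of a 1-D kervolution and of a residual-based anomaly test; the function names and the simple mean/std threshold are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def kervolution1d(x, w, kernel="linear", cp=1.0, dp=2, gamma=1.0):
    """Slide filter w over signal x, replacing the usual inner product
    with a kernel evaluation (the linear kernel recovers convolution)."""
    k = len(w)
    out = np.empty(len(x) - k + 1)
    for i in range(len(out)):
        patch = x[i:i + k]
        if kernel == "polynomial":           # (x . w + c_p) ** d_p
            out[i] = (patch @ w + cp) ** dp
        elif kernel == "gaussian":           # exp(-gamma * ||x - w||^2)
            out[i] = np.exp(-gamma * np.sum((patch - w) ** 2))
        else:                                # plain cross-correlation
            out[i] = patch @ w
    return out

def residual_anomaly_mask(x, x_hat, z=3.0):
    """Flag time steps whose reconstruction residual |x - x_hat| exceeds
    the residual mean by z standard deviations."""
    r = np.abs(x - x_hat)
    return r > r.mean() + z * r.std()
```

With the linear kernel the layer reduces to ordinary cross-correlation, which is what allows a mixed architecture to interleave kervolutional and convolutional layers in one network; in the paper the residuals come from a temporal auto-encoder rather than from a direct reconstruction as in this sketch.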
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Deep Learning Architectures for FSCV, a Comparison [0.0]
Suitability is determined by the predictive performance in the "out-of-probe" case, the response to artificially induced electrical noise, and the ability to predict when the model will be errant for a given probe.
The InceptionTime architecture, a deep convolutional neural network, has the best absolute predictive performance of the models tested but was more susceptible to noise.
A naive multilayer perceptron architecture had the second lowest prediction error and was less affected by the artificial noise, suggesting that convolutions may not be as important for this task as one might suspect.
arXiv Detail & Related papers (2022-12-05T00:20:10Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Stacked Residuals of Dynamic Layers for Time Series Anomaly Detection [0.0]
We present an end-to-end differentiable neural network architecture to perform anomaly detection in multivariate time series.
The architecture is a cascade of dynamical systems designed to separate linearly predictable components of the signal.
The anomaly detector exploits the temporal structure of the prediction residuals to detect both isolated point anomalies and set-point changes.
arXiv Detail & Related papers (2022-02-25T01:50:22Z)
- Deep Generative model with Hierarchical Latent Factors for Time Series Anomaly Detection [40.21502451136054]
This work presents DGHL, a new family of generative models for time series anomaly detection.
A top-down Convolution Network maps a novel hierarchical latent space to time series windows, exploiting temporal dynamics to encode information efficiently.
Our method outperformed current state-of-the-art models on four popular benchmark datasets.
arXiv Detail & Related papers (2022-02-15T17:19:44Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- SignalNet: A Low Resolution Sinusoid Decomposition and Estimation Network [79.04274563889548]
We propose SignalNet, a neural network architecture that detects the number of sinusoids and estimates their parameters from quantized in-phase and quadrature samples.
We introduce a worst-case learning threshold for comparing the results of our network relative to the underlying data distributions.
In simulation, we find that our algorithm is always able to surpass the threshold for three-bit data but often cannot exceed the threshold for one-bit data.
arXiv Detail & Related papers (2021-06-10T04:21:20Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Dynamic Bayesian Neural Networks [2.28438857884398]
We define a neural network that evolves in time, called a Hidden Markov neural network.
The weights of a feed-forward neural network are modelled as the hidden states of a Hidden Markov model.
A filtering algorithm is used to learn a variational approximation to the time-evolving posterior over the weights.
arXiv Detail & Related papers (2020-04-15T09:18:18Z)
- RobustTAD: Robust Time Series Anomaly Detection via Decomposition and Convolutional Neural Networks [37.16594704493679]
We propose RobustTAD, a Robust Time series Anomaly Detection framework.
It integrates robust seasonal-trend decomposition with convolutional neural networks for time series data.
It is deployed as a public online service and widely adopted in different business scenarios at Alibaba Group.
arXiv Detail & Related papers (2020-02-21T20:43:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.