Deep Generative model with Hierarchical Latent Factors for Time Series
Anomaly Detection
- URL: http://arxiv.org/abs/2202.07586v1
- Date: Tue, 15 Feb 2022 17:19:44 GMT
- Title: Deep Generative model with Hierarchical Latent Factors for Time Series
Anomaly Detection
- Authors: Cristian Challu and Peihong Jiang and Ying Nian Wu and Laurent Callot
- Abstract summary: This work presents DGHL, a new family of generative models for time series anomaly detection.
A top-down Convolution Network maps a novel hierarchical latent space to time series windows, exploiting temporal dynamics to encode information efficiently.
Our method outperformed current state-of-the-art models on four popular benchmark datasets.
- Score: 40.21502451136054
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multivariate time series anomaly detection has become an active area of
research in recent years, with Deep Learning models outperforming previous
approaches on benchmark datasets. Among reconstruction-based models, most
previous work has focused on Variational Autoencoders and Generative
Adversarial Networks. This work presents DGHL, a new family of generative
models for time series anomaly detection, trained by maximizing the observed
likelihood by posterior sampling and alternating back-propagation. A top-down
Convolution Network maps a novel hierarchical latent space to time series
windows, exploiting temporal dynamics to encode information efficiently.
Despite relying on posterior sampling, it is computationally more efficient
than current approaches, with up to 10x shorter training times than RNN-based
models. Our method outperformed current state-of-the-art models on four popular
benchmark datasets. Finally, DGHL is robust to variable features between
entities and accurate even with large proportions of missing values, settings
of increasing relevance with the advent of IoT. We demonstrate the superior
robustness of DGHL with occlusion experiments novel to this literature. Our
code is available at https://github.com/cchallu/dghl.
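The training scheme the abstract describes (posterior sampling followed by a generator update, i.e. alternating back-propagation) and the reconstruction-based anomaly score can be sketched on a toy linear generator. Everything below is an illustrative assumption, not the paper's implementation: `langevin_posterior_sample`, `alternating_backprop_step`, the linear decoder `W`, and all hyper-parameters are stand-ins for DGHL's top-down ConvNet and settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_posterior_sample(x, W, z, steps=100, eps=0.2, sigma=1.0):
    """Draw an approximate sample z ~ p(z | x) by short-run Langevin
    dynamics for a toy linear generator g(z) = W @ z with N(0, sigma^2)
    observation noise and a standard normal prior on z. (DGHL uses a
    top-down ConvNet generator; the gradients here are analytic.)"""
    for _ in range(steps):
        grad = W.T @ (x - W @ z) / sigma**2 - z  # d/dz [log p(x|z) + log p(z)]
        z = z + 0.5 * eps**2 * grad + eps * rng.normal(size=z.shape)
    return z

def alternating_backprop_step(x, W, z, lr=0.01, sigma=1.0):
    """One alternation: (1) refresh the latent by posterior sampling,
    (2) move the generator parameters up the likelihood gradient."""
    z = langevin_posterior_sample(x, W, z)
    W = W + lr * np.outer(x - W @ z, z) / sigma**2
    return W, z

def anomaly_score(x, W, z_dim):
    """Reconstruction error under a posterior sample: windows the
    generator cannot explain receive high scores."""
    z = langevin_posterior_sample(x, W, np.zeros(z_dim))
    return float(np.sum((x - W @ z) ** 2))

# Toy check: a window shifted off the generator's manifold scores higher.
W = rng.normal(size=(8, 2))             # stand-in "decoder"
normal_window = W @ rng.normal(size=2)  # lies on the generator's manifold
anomalous_window = normal_window + 5.0  # large additive fault
s_normal = anomaly_score(normal_window, W, 2)
s_anomalous = anomaly_score(anomalous_window, W, 2)
assert s_anomalous > s_normal
```

Because no encoder network is trained, inference cost is paid per window via the Langevin steps; the abstract's efficiency claim is about this trade-off versus RNN-based alternatives.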
Related papers
- Multivariate Time-Series Anomaly Detection based on Enhancing Graph Attention Networks with Topological Analysis [31.43159668073136]
Unsupervised anomaly detection in time series is essential in industrial applications, as it significantly reduces the need for manual intervention.
Traditional methods use Graph Neural Networks (GNNs) or Transformers to analyze spatial dependencies, while RNNs model temporal dependencies.
This paper introduces a novel temporal model built on an enhanced Graph Attention Network (GAT) for multivariate time series anomaly detection called TopoGDN.
arXiv Detail & Related papers (2024-08-23T14:06:30Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
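The linear latent prior mentioned above can be illustrated in a few lines. The rollout below is a hedged sketch: the function name and parameters are hypothetical, and the VAE encoder/decoder are omitted, so only the Koopman-style linear map driving the conditional prior is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

def linear_prior_rollout(A, z0, T, noise_scale=0.1):
    """Roll out the conditional prior z_{t+1} = A z_t + eps_t, i.e. the
    Koopman-inspired *linear* map placed on the latent dynamics."""
    zs = [z0]
    for _ in range(T - 1):
        zs.append(A @ zs[-1] + noise_scale * rng.normal(size=z0.shape))
    return np.stack(zs)

# A slightly contractive rotation: stable, oscillatory latent dynamics.
theta = 0.3
A = 0.95 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
latents = linear_prior_rollout(A, np.array([1.0, 0.0]), T=50)
assert latents.shape == (50, 2)
```

Because the dynamics are a single matrix, stability and frequency content can be read directly off the eigenvalues of A, which is the appeal of the Koopman view.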
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- MADS: Modulated Auto-Decoding SIREN for time series imputation [9.673093148930874]
We propose MADS, a novel auto-decoding framework for time series imputation, built upon implicit neural representations.
We evaluate our model on two real-world datasets, and show that it outperforms state-of-the-art methods for time series imputation.
arXiv Detail & Related papers (2023-07-03T09:08:47Z)
- Robust Audio Anomaly Detection [10.75127981612396]
The presented approach doesn't assume the presence of labeled anomalies in the training dataset.
The temporal dynamics are modeled using recurrent layers augmented with an attention mechanism.
The output of the network is an outlier robust probability density function.
arXiv Detail & Related papers (2022-02-03T17:19:42Z)
- Generative time series models using Neural ODE in Variational Autoencoders [0.0]
We implement Neural Ordinary Differential Equations in a Variational Autoencoder setting for generative time series modeling.
An object-oriented approach to the code was taken to allow for easier development and research.
arXiv Detail & Related papers (2022-01-12T14:38:11Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
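A per-time-stamp Gaussian parameterization like the one described turns anomaly detection into thresholding the negative log-likelihood. The sketch below stubs in the network outputs: `mu` and `sigma` are fixed arrays standing in for predictions, not learned quantities.

```python
import numpy as np

def gaussian_nll(x, mu, sigma):
    """Negative log-likelihood of each time stamp under N(mu_t, sigma_t^2);
    large values flag time stamps the model finds surprising."""
    return 0.5 * (np.log(2 * np.pi * sigma**2) + ((x - mu) / sigma) ** 2)

x = np.array([0.1, 0.0, 5.0, -0.1])  # third time stamp is an outlier
mu = np.zeros(4)                      # stand-in for network-predicted means
sigma = 0.5 * np.ones(4)              # stand-in for predicted std devs
scores = gaussian_nll(x, mu, sigma)
assert int(scores.argmax()) == 2
```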
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
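One family of scoring techniques such a GAN admits combines the reconstruction error with the critic's output. The convex mix below is a hedged sketch under that assumption; `tadgan_style_score` is an illustrative name, and the combination actually reported in the paper is the one selected empirically.

```python
import numpy as np

def tadgan_style_score(recon_err, critic_score, alpha=0.5):
    """Z-normalize the reconstruction error and the critic output, then
    mix them. High reconstruction error and a low critic score both
    indicate an anomalous window."""
    def zscore(v):
        return (v - v.mean()) / (v.std() + 1e-8)
    return alpha * zscore(recon_err) - (1 - alpha) * zscore(critic_score)

recon = np.array([0.1, 0.2, 3.0, 0.1])     # window 2 is poorly reconstructed
critic = np.array([0.9, 0.8, -2.0, 0.9])   # and the critic rejects it
scores = tadgan_style_score(recon, critic)
assert int(scores.argmax()) == 2
```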
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.