MGADN: A Multi-task Graph Anomaly Detection Network for Multivariate
Time Series
- URL: http://arxiv.org/abs/2211.12141v1
- Date: Tue, 22 Nov 2022 10:17:42 GMT
- Title: MGADN: A Multi-task Graph Anomaly Detection Network for Multivariate
Time Series
- Authors: Weixuan Xiong, Xiaochen Sun
- Abstract summary: Anomaly detection for time series, especially multivariate time series (time series with multiple sensors), has been studied for several years.
Existing methods, including neural networks, concentrate only on relationships across timestamps.
Our approach uses GAT, which originates from graph neural networks, to capture connections between sensors.
The model is also double-headed, computing both a prediction loss and a reconstruction loss via a VAE (Variational Auto-Encoder).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Anomaly detection for time series, especially multivariate time
series (time series with multiple sensors), has been studied for several years.
Although existing methods have made great progress, several challenging
problems remain. First, existing methods, including neural networks, only model
relationships across timestamps; that is, they only ask how data from the past
influences data in the future. However, one sensor sometimes affects another;
for example, wind speed may cause a drop in temperature. Second, there are two
categories of models for time series anomaly detection: prediction models and
reconstruction models. Prediction models are adept at learning temporal
representations but struggle when anomalies are sparse, while reconstruction
models behave in the opposite way. Our main topic is therefore how to
efficiently capture relationships across both timestamps and sensors. Our
approach uses GAT, which originates from graph neural networks, to capture
connections between sensors, and LSTM to capture temporal relationships. The
model is also double-headed, computing both a prediction loss and a
reconstruction loss via a VAE (Variational Auto-Encoder). To take advantage of
both kinds of model, a multi-task optimization algorithm is used.
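The sketch below illustrates the double-headed design described in the
abstract, assuming PyTorch. The layer sizes, the simplified single-head
attention over sensors, and the fixed loss weight are illustrative stand-ins,
not the authors' MGADN implementation.

```python
# Hedged sketch of a double-headed (prediction + VAE reconstruction) model with
# graph attention over sensors and an LSTM over time. Shapes and hyperparameters
# are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGATLayer(nn.Module):
    """Single-head attention over sensors (a simplified stand-in for GAT)."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, x):                        # x: (batch, sensors, dim)
        h = self.proj(x)
        n = h.size(1)
        pairs = torch.cat([h.unsqueeze(2).expand(-1, -1, n, -1),
                           h.unsqueeze(1).expand(-1, n, -1, -1)], dim=-1)
        alpha = torch.softmax(self.attn(pairs).squeeze(-1), dim=-1)  # (b, n, n)
        return torch.bmm(alpha, h)               # attention-weighted sensor mixing


class DoubleHeadedModel(nn.Module):
    def __init__(self, n_sensors, window, hidden=64, latent=16):
        super().__init__()
        self.gat = SimpleGATLayer(window)                          # inter-sensor relations
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True)   # temporal relations
        self.pred_head = nn.Linear(hidden, n_sensors)              # prediction head
        self.to_mu = nn.Linear(hidden, latent)                     # VAE reconstruction head
        self.to_logvar = nn.Linear(hidden, latent)
        self.decoder = nn.Linear(latent, n_sensors * window)

    def forward(self, x):                         # x: (batch, window, n_sensors)
        feat = self.gat(x.transpose(1, 2)).transpose(1, 2)   # mix sensors per window
        out, _ = self.lstm(feat)
        h = out[:, -1]                                       # last hidden state
        next_step = self.pred_head(h)                        # forecast of next values
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        recon = self.decoder(z).view(x.shape)                # reconstruct the window
        return next_step, recon, mu, logvar


def multi_task_loss(next_step, target, recon, x, mu, logvar, w=0.5):
    # Weighted sum of prediction loss and VAE loss (reconstruction + KL); the
    # fixed weight w is a placeholder for the paper's multi-task optimization.
    pred_loss = F.mse_loss(next_step, target)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return w * pred_loss + (1 - w) * (F.mse_loss(recon, x) + kl)
```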
Related papers
- Multivariate Time-Series Anomaly Detection based on Enhancing Graph Attention Networks with Topological Analysis [31.43159668073136]
Unsupervised anomaly detection in time series is essential in industrial applications, as it significantly reduces the need for manual intervention.
Traditional methods use Graph Neural Networks (GNNs) or Transformers to analyze spatial dependencies, while RNNs model temporal dependencies.
This paper introduces TopoGDN, a temporal model built on an enhanced Graph Attention Network (GAT) for multivariate time series anomaly detection.
arXiv Detail & Related papers (2024-08-23T14:06:30Z) - Deciphering Movement: Unified Trajectory Generation Model for Multi-Agent [53.637837706712794]
We propose a Unified Trajectory Generation model, UniTraj, that processes arbitrary trajectories as masked inputs.
Specifically, we introduce a Ghost Spatial Masking (GSM) module embedded within a Transformer encoder for spatial feature extraction.
We benchmark three practical sports game datasets, Basketball-U, Football-U, and Soccer-U, for evaluation.
arXiv Detail & Related papers (2024-05-27T22:15:23Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Networked Time Series Imputation via Position-aware Graph Enhanced
Variational Autoencoders [31.953958053709805]
We design a new model named PoGeVon which leverages variational autoencoder (VAE) to predict missing values over both node time series features and graph structures.
Experiment results demonstrate the effectiveness of our model over baselines.
arXiv Detail & Related papers (2023-05-29T21:11:34Z) - Continuous time recurrent neural networks: overview and application to
forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z) - Approximating DTW with a convolutional neural network on EEG data [9.409281517596396]
We propose a fast and differentiable approximation of Dynamic Time Warping (DTW).
We show that our methods achieve at least the same level of accuracy as other DTW main approximations with higher computational efficiency.
arXiv Detail & Related papers (2023-01-30T13:27:47Z) - AER: Auto-Encoder with Regression for Time Series Anomaly Detection [12.418290128163882]
Anomaly detection on time series data is increasingly common across various industrial domains.
Recent unsupervised machine learning methods have made remarkable progress in tackling this problem.
We propose AER (Auto-encoder with Regression), a joint model that combines a vanilla auto-encoder and an LSTM regressor.
arXiv Detail & Related papers (2022-12-27T17:22:21Z) - DynImp: Dynamic Imputation for Wearable Sensing Data Through Sensory and
Temporal Relatedness [78.98998551326812]
We argue that traditional methods have rarely made use of both the time-series dynamics of the data and the relatedness of features from different sensors.
We propose a model, termed DynImp, to handle missingness at different time points using nearest neighbors along the feature axis.
We show that the method can exploit the multi-modality features from related sensors and also learn from history time-series dynamics to reconstruct the data under extreme missingness.
arXiv Detail & Related papers (2022-09-26T21:59:14Z) - STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z) - Multi-scale Anomaly Detection for Big Time Series of Industrial Sensors [50.6434162489902]
We propose a reconstruction-based anomaly detection method, MissGAN, iteratively learning to decode and encode naturally smooth time series.
MissGAN does not need labels or only needs labels of normal instances, making it widely applicable.
arXiv Detail & Related papers (2022-04-18T04:34:15Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
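As a rough illustration of the per-time-stamp mean/variance parameterization
mentioned in the SISVAE entry above, the following sketch uses a generic
recurrent encoder with a Gaussian output head; it is an assumed, simplified
stand-in rather than the SISVAE architecture.

```python
# Hedged sketch: a recurrent model that outputs a mean and log-variance for
# every time-stamp, so that low likelihood under the predicted Gaussian can
# flag anomalies. The GRU encoder and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn


class GaussianOutputRNN(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.mean = nn.Linear(hidden, n_features)      # per-time-stamp mean
        self.logvar = nn.Linear(hidden, n_features)    # per-time-stamp log-variance

    def forward(self, x):                   # x: (batch, time, n_features)
        h, _ = self.rnn(x)
        return self.mean(h), self.logvar(h)


def gaussian_nll(x, mu, logvar):
    # Negative log-likelihood per time-stamp; higher values indicate anomalies.
    return 0.5 * (logvar + (x - mu).pow(2) / logvar.exp()).mean()
```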