Abnormality Forecasting: Time Series Anomaly Prediction via Future Context Modeling
- URL: http://arxiv.org/abs/2410.12206v1
- Date: Wed, 16 Oct 2024 04:00:00 GMT
- Title: Abnormality Forecasting: Time Series Anomaly Prediction via Future Context Modeling
- Authors: Sinong Zhao, Wenrui Wang, Hongzuo Xu, Zhaoyang Yu, Qingsong Wen, Gang Wang, Xiaoguang Liu, Guansong Pang
- Abstract summary: Identifying anomalies from time series data plays an important role in various fields such as infrastructure security, intelligent operation and maintenance, and space exploration.
Current research focuses on detecting anomalies only after they occur, which can lead to significant financial/reputational loss or infrastructure damage.
In this work we study a more practical yet very challenging problem, time series anomaly prediction, aiming to provide early warnings for abnormal events before their occurrence.
- Score: 30.87477150049186
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Identifying anomalies from time series data plays an important role in various fields such as infrastructure security, intelligent operation and maintenance, and space exploration. Current research focuses on detecting anomalies after they occur, which can lead to significant financial/reputational loss or infrastructure damage. In this work we instead study a more practical yet very challenging problem, time series anomaly prediction, aiming to provide early warnings for abnormal events before their occurrence. To tackle this problem, we introduce a novel principled approach, namely future context modeling (FCM). Its key insight is that future abnormal events in a target window can be accurately predicted if their preceding observation window exhibits any subtle difference from normal data. To effectively capture such differences, FCM first leverages long-term forecasting models to generate a discriminative future context based on the observation data, aiming to amplify those subtle but unusual differences. It then models a normality correlation between the observation data and the forecasted future context to complement the normality modeling of the observation data in foreseeing possible abnormality in the target window. A joint variate-time attention learning is also introduced in FCM to leverage both temporal signals and features of the time series data for more discriminative normality modeling in the aforementioned two views. Comprehensive experiments on five datasets demonstrate that FCM achieves good recall rates (70%+) on multiple datasets and significantly outperforms all baselines in F1 score. Code is available at https://github.com/mala-lab/FCM.
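The abstract outlines a two-view pipeline: a long-term forecaster turns the observation window into a future context, and normality is then modeled jointly on the observation window and on its correlation with that forecasted context. The following is only a minimal sketch of that idea, not the authors' implementation (see https://github.com/mala-lab/FCM); the module choices, dimensions, cross-attention, and the one-class scoring rule are all assumptions.

```python
# Minimal sketch of the future-context-modeling idea described in the abstract.
# Module names, dimensions, and the scoring rule are illustrative assumptions,
# not the official FCM implementation.
import torch
import torch.nn as nn


class FCMSketch(nn.Module):
    def __init__(self, n_vars: int, obs_len: int, horizon: int, d_model: int = 64):
        super().__init__()
        # Long-term forecaster: maps each variate's observation window to a future context.
        self.forecaster = nn.Linear(obs_len, horizon)
        # Separate encoders for the observation window and the forecasted future context.
        self.obs_encoder = nn.Linear(obs_len, d_model)
        self.ctx_encoder = nn.Linear(horizon, d_model)
        # Cross-attention models the correlation between observation and future context.
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        # One-class style head: embeddings of normal data are pulled toward a learned center.
        self.center = nn.Parameter(torch.zeros(d_model))

    def forward(self, x_obs: torch.Tensor):
        # x_obs: (batch, n_vars, obs_len)
        future_ctx = self.forecaster(x_obs)            # (batch, n_vars, horizon)
        h_obs = self.obs_encoder(x_obs)                # (batch, n_vars, d_model)
        h_ctx = self.ctx_encoder(future_ctx)           # (batch, n_vars, d_model)
        # Attend from observation variates to forecasted-context variates.
        h_joint, _ = self.cross_attn(h_obs, h_ctx, h_ctx)
        # Anomaly score: distance of the joint embedding from the normality center.
        score = ((h_joint.mean(dim=1) - self.center) ** 2).sum(dim=-1)
        return score, future_ctx


if __name__ == "__main__":
    model = FCMSketch(n_vars=5, obs_len=96, horizon=24)
    x = torch.randn(8, 5, 96)                          # a batch of observation windows
    score, ctx = model(x)
    print(score.shape, ctx.shape)                      # torch.Size([8]) torch.Size([8, 5, 24])
```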
Related papers
- Graph-Augmented LSTM for Forecasting Sparse Anomalies in Graph-Structured Time Series [0.0]
We propose a graph-augmented time series forecasting approach that explicitly integrates the graph of relationships among time series into an LSTM forecasting model.
We evaluate the approach on two benchmark datasets - the Yahoo Webscope S5 anomaly dataset and the METR-LA traffic sensor network.
Results demonstrate that the graph-augmented model achieves significantly higher precision and recall, improving F1-score by up to 10% over the best baseline.
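As a rough illustration of how a relationship graph can be folded into an LSTM forecaster, here is a minimal sketch assuming a simple row-normalized adjacency aggregation; it is not necessarily the architecture used in the paper.

```python
# Illustrative sketch only: one way to inject a relationship graph into an LSTM
# forecaster. The neighbor-aggregation scheme is an assumption.
import torch
import torch.nn as nn


class GraphAugmentedLSTM(nn.Module):
    def __init__(self, n_series: int, adj: torch.Tensor, hidden: int = 32):
        super().__init__()
        # Row-normalized adjacency mixes each series with its graph neighbors.
        self.register_buffer("adj_norm", adj / adj.sum(dim=1, keepdim=True).clamp(min=1))
        self.lstm = nn.LSTM(input_size=2 * n_series, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_series)        # one-step-ahead forecast

    def forward(self, x: torch.Tensor):
        # x: (batch, time, n_series); concatenate the neighbor-aggregated signal per step.
        x_neigh = x @ self.adj_norm.T
        out, _ = self.lstm(torch.cat([x, x_neigh], dim=-1))
        return self.head(out[:, -1])                   # (batch, n_series)


if __name__ == "__main__":
    adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
    model = GraphAugmentedLSTM(n_series=3, adj=adj)
    forecast = model(torch.randn(4, 24, 3))
    print(forecast.shape)                              # torch.Size([4, 3])
```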
arXiv Detail & Related papers (2025-03-05T18:37:52Z)
- Self-attention-based Diffusion Model for Time-series Imputation in Partial Blackout Scenarios [23.160007389272575]
Missing values in time series data can harm machine learning performance and introduce bias.
Previous work has tackled the imputation of missing data in random, complete blackouts and forecasting scenarios.
We introduce a two-stage imputation process using self-attention and diffusion processes to model feature and temporal correlations.
arXiv Detail & Related papers (2025-03-03T16:58:15Z)
- Adapting to Length Shift: FlexiLength Network for Trajectory Prediction [53.637837706712794]
Trajectory prediction plays an important role in various applications, including autonomous driving, robotics, and scene understanding.
Existing approaches mainly focus on developing compact neural networks to increase prediction precision on public datasets, typically employing a standardized input duration.
We introduce a general and effective framework, the FlexiLength Network (FLN), to enhance the robustness of existing trajectory prediction against varying observation periods.
arXiv Detail & Related papers (2024-03-31T17:18:57Z)
- FreDF: Learning to Forecast in the Frequency Domain [54.2091536822376]
Time series modeling presents unique challenges due to autocorrelation in both historical data and future sequences.
We propose the Frequency-enhanced Direct Forecast (FreDF), which mitigates label autocorrelation by learning to forecast in the frequency domain.
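A minimal sketch of what a frequency-domain forecasting loss can look like; the specific blend of time- and frequency-domain terms below is an assumption, not FreDF's exact objective.

```python
# Minimal sketch of a frequency-domain forecasting loss in the spirit of FreDF;
# the exact loss composition used in the paper is an assumption here.
import torch


def frequency_domain_loss(pred: torch.Tensor, target: torch.Tensor, alpha: float = 0.5):
    """Blend a time-domain MSE with an L1 loss on the real FFT of the horizon.

    pred, target: (batch, horizon, n_vars)
    """
    time_loss = torch.mean((pred - target) ** 2)
    # Comparing spectra attenuates the step-by-step label autocorrelation
    # present in time-domain targets.
    pred_f = torch.fft.rfft(pred, dim=1)
    target_f = torch.fft.rfft(target, dim=1)
    freq_loss = torch.mean(torch.abs(pred_f - target_f))
    return alpha * freq_loss + (1 - alpha) * time_loss


if __name__ == "__main__":
    pred, target = torch.randn(8, 24, 5), torch.randn(8, 24, 5)
    print(frequency_domain_loss(pred, target).item())
```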
arXiv Detail & Related papers (2024-02-04T08:23:41Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting [54.04430089029033]
We present Lag-Llama, a general-purpose foundation model for time series forecasting based on a decoder-only transformer architecture.
Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization capabilities.
When fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance.
arXiv Detail & Related papers (2023-10-12T12:29:32Z)
- CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
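As a toy illustration of contrastive learning over time series windows, the sketch below treats lightly perturbed neighbor windows as positives and windows with an injected spike as negatives; CARLA's actual pretext and self-labelling stages are richer than this, so treat the encoder, loss, and augmentation choices as assumptions.

```python
# Toy sketch of contrastive representation learning for time series windows.
# Positive = lightly perturbed window, negative = window with an injected spike.
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(nn.Flatten(), nn.Linear(5 * 32, 64), nn.ReLU(), nn.Linear(64, 32))


def embed(x):                        # x: (batch, n_vars, window) -> unit-norm embedding
    return F.normalize(encoder(x), dim=-1)


def contrastive_loss(anchor, positive, negative, margin: float = 0.5):
    za, zp, zn = embed(anchor), embed(positive), embed(negative)
    d_pos = 1 - (za * zp).sum(-1)    # cosine distance to the positive window
    d_neg = 1 - (za * zn).sum(-1)    # cosine distance to the anomaly-injected window
    return F.relu(d_pos - d_neg + margin).mean()


anchor = torch.randn(16, 5, 32)
positive = anchor + 0.05 * torch.randn_like(anchor)   # adjacent / lightly perturbed window
negative = anchor.clone()
negative[:, :, 16] += 5.0                             # inject a synthetic spike anomaly
print(contrastive_loss(anchor, positive, negative).item())
```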
arXiv Detail & Related papers (2023-08-18T04:45:56Z)
- DEGAN: Time Series Anomaly Detection using Generative Adversarial Network Discriminators and Density Estimation [0.0]
We propose DEGAN, an unsupervised Generative Adversarial Network (GAN)-based anomaly detection framework.
It relies solely on normal time series data as input to train a well-configured discriminator (D) into a standalone anomaly predictor.
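A hedged sketch of the "discriminator as anomaly predictor" idea follows: after GAN training on normal data only, windows the discriminator rejects are flagged. The network size is an assumption, the GAN training loop is omitted, and DEGAN's density-estimation post-processing is not reproduced.

```python
# Sketch of scoring with a GAN discriminator trained on normal windows only.
# Untrained here; scores become meaningful only after adversarial training.
import torch
import torch.nn as nn

WINDOW = 64

discriminator = nn.Sequential(
    nn.Linear(WINDOW, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),        # probability that a window looks "real normal"
)


def anomaly_score(window: torch.Tensor) -> torch.Tensor:
    # Low discriminator confidence -> high anomaly score.
    with torch.no_grad():
        return 1.0 - discriminator(window).squeeze(-1)


normal = torch.sin(torch.linspace(0, 6.28, WINDOW)).unsqueeze(0)
spiky = normal.clone()
spiky[0, 30] += 4.0
print(anomaly_score(normal), anomaly_score(spiky))
```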
arXiv Detail & Related papers (2022-10-05T04:32:12Z)
- Koopman-theoretic Approach for Identification of Exogenous Anomalies in Nonstationary Time-series Data [3.050919759387984]
We build a general method for classifying anomalies in multi-dimensional time-series data.
We demonstrate our proposed method on the important real-world task of global atmospheric pollution monitoring.
The system successfully detects localized anomalies in air quality due to events such as COVID-19 lockdowns and wildfires.
arXiv Detail & Related papers (2022-09-18T17:59:04Z)
- MAD: Self-Supervised Masked Anomaly Detection Task for Multivariate Time Series [14.236092062538653]
Masked Anomaly Detection (MAD) is a general self-supervised learning task for multivariate time series anomaly detection.
By randomly masking a portion of the inputs and training a model to estimate them, MAD is an improvement over the traditional left-to-right next step prediction (NSP) task.
Our experimental results demonstrate that MAD can achieve better anomaly detection rates over traditional NSP approaches.
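A small sketch of the masked-estimation pretext task described above; the mask ratio, Transformer backbone, and anomaly scoring rule are illustrative assumptions rather than the paper's configuration.

```python
# Minimal sketch of a masked-estimation pretext task for multivariate time series.
import torch
import torch.nn as nn


class MaskedEstimator(nn.Module):
    def __init__(self, n_vars: int, d_model: int = 64):
        super().__init__()
        self.proj = nn.Linear(n_vars, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_vars)

    def forward(self, x: torch.Tensor, mask_ratio: float = 0.25):
        # x: (batch, time, n_vars); randomly zero out a fraction of the time steps.
        mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio   # (batch, time)
        x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)
        recon = self.head(self.encoder(self.proj(x_masked)))
        # Training loss: estimate the masked values; at test time, large
        # reconstruction errors serve as anomaly scores.
        loss = ((recon - x) ** 2)[mask].mean()
        return loss, recon


model = MaskedEstimator(n_vars=5)
loss, recon = model(torch.randn(8, 48, 5))
print(loss.item(), recon.shape)
```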
arXiv Detail & Related papers (2022-05-04T14:55:42Z)
- Monte Carlo EM for Deep Time Series Anomaly Detection [6.312089019297173]
Time series data are often corrupted by outliers or other kinds of anomalies.
Recent approaches to anomaly detection and forecasting assume that the proportion of anomalies in the training data is small enough to ignore.
We present a technique for augmenting existing time series models so that they explicitly account for anomalies in the training data.
arXiv Detail & Related papers (2021-12-29T07:52:36Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
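As a rough sketch of that family of scores, the snippet below combines a cycle reconstruction error with the critic's output; the tiny LSTM networks and the alpha weighting are assumptions, and the released implementation also applies detailed error smoothing.

```python
# Rough sketch of TadGAN-style scoring: reconstruction error plus critic judgement.
import torch
import torch.nn as nn


class LSTMAutoencoder(nn.Module):
    def __init__(self, n_vars: int = 1, hidden: int = 32, latent: int = 8):
        super().__init__()
        self.enc = nn.LSTM(n_vars, latent, batch_first=True)
        self.dec = nn.LSTM(latent, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_vars)

    def forward(self, x):                          # x: (batch, time, n_vars)
        z, _ = self.enc(x)
        h, _ = self.dec(z)
        return self.out(h)


class Critic(nn.Module):
    def __init__(self, n_vars: int = 1, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_vars, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.lstm(x)
        return self.score(h[:, -1]).squeeze(-1)    # higher = looks more like normal data


def anomaly_score(x, autoencoder, critic, alpha: float = 0.5):
    with torch.no_grad():
        recon_err = ((autoencoder(x) - x) ** 2).mean(dim=(1, 2))
        critic_term = -critic(x)                   # low critic score -> more anomalous
    return alpha * recon_err + (1 - alpha) * critic_term


x = torch.randn(4, 100, 1)
print(anomaly_score(x, LSTMAutoencoder(), Critic()))
```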
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
- Exploring Bayesian Surprise to Prevent Overfitting and to Predict Model Performance in Non-Intrusive Load Monitoring [25.32973996508579]
Non-Intrusive Load Monitoring (NILM) is a field of research focused on segregating constituent electrical loads in a system based only on their aggregated signal.
We quantify the degree of surprise between the predictive distribution (termed postdictive surprise) and the transitional probabilities (termed transitional surprise).
This work provides clear evidence that a point of diminishing returns of model performance with respect to dataset size exists.
arXiv Detail & Related papers (2020-09-16T15:39:08Z)
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
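A small sketch of the multiple-hypothesis idea: the model emits K candidate futures and a relaxed winner-takes-all loss trains mainly the closest one on each sample, so heads specialize on different plausible outcomes. The epsilon relaxation and the GRU backbone are illustrative assumptions, not the MHP paper's exact formulation.

```python
# Sketch of multiple-hypothesis sequence prediction with a relaxed winner-takes-all loss.
import torch
import torch.nn as nn


class MultiHypothesisForecaster(nn.Module):
    def __init__(self, n_vars: int, horizon: int, k: int = 4, hidden: int = 32):
        super().__init__()
        self.k, self.horizon, self.n_vars = k, horizon, n_vars
        self.rnn = nn.GRU(n_vars, hidden, batch_first=True)
        self.heads = nn.Linear(hidden, k * horizon * n_vars)

    def forward(self, x):                           # x: (batch, time, n_vars)
        _, h = self.rnn(x)
        out = self.heads(h[-1])
        return out.view(-1, self.k, self.horizon, self.n_vars)


def relaxed_wta_loss(hyps, target, eps: float = 0.05):
    # hyps: (batch, K, horizon, n_vars); target: (batch, horizon, n_vars)
    errs = ((hyps - target.unsqueeze(1)) ** 2).mean(dim=(2, 3))   # (batch, K)
    best = errs.min(dim=1).values
    # Mostly train the best hypothesis, with a small epsilon weight on the rest.
    return ((1 - eps) * best + eps * errs.mean(dim=1)).mean()


model = MultiHypothesisForecaster(n_vars=2, horizon=12)
hyps = model(torch.randn(8, 30, 2))
print(relaxed_wta_loss(hyps, torch.randn(8, 12, 2)).item())
```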
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.