RobustTSF: Towards Theory and Design of Robust Time Series Forecasting with Anomalies
- URL: http://arxiv.org/abs/2402.02032v1
- Date: Sat, 3 Feb 2024 05:13:09 GMT
- Title: RobustTSF: Towards Theory and Design of Robust Time Series Forecasting with Anomalies
- Authors: Hao Cheng, Qingsong Wen, Yang Liu, Liang Sun
- Abstract summary: We develop methods to automatically learn a robust forecasting model from contaminated data.
Based on our analyses, we propose a simple and efficient algorithm to learn a robust forecasting model.
- Score: 28.59935971037066
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is an important and prominent task in many real-world
applications. However, most time series forecasting techniques assume that
the training data are clean and free of anomalies. This assumption is unrealistic,
since collected time series data can be contaminated in practice, and a
forecasting model trained directly on time series with anomalies will be
inferior. It is therefore essential to develop methods that automatically learn
a robust forecasting model from contaminated data. In this paper, we first
statistically define three types of anomalies, then theoretically and
experimentally analyze the loss robustness and sample robustness when these
anomalies exist. Based on our analyses, we propose a simple and efficient
algorithm to learn a robust forecasting model. Extensive experiments show that
our method is highly robust and outperforms all existing approaches. The code
is available at https://github.com/haochenglouis/RobustTSF.
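The abstract describes a general recipe: statistically characterize the anomalies, then learn a forecaster that stays robust to them. Below is a minimal Python sketch of that general idea, assuming a simple moving-average anomaly score, window filtering, and an MAE-trained linear forecaster; the thresholds and helper names are illustrative assumptions, and the authors' actual RobustTSF algorithm is in the repository linked above.

```python
# Illustrative sketch only: score points by deviation from a smoothed trend, keep training
# windows that look clean, and fit a forecaster with a robust (MAE) loss. This mirrors the
# general idea in the abstract, not the exact RobustTSF procedure.
import numpy as np
import torch
import torch.nn as nn

def anomaly_scores(series: np.ndarray, k: int = 5) -> np.ndarray:
    """Score each point by its absolute deviation from a centered moving-average trend."""
    pad = np.pad(series, (k, k), mode="edge")
    trend = np.convolve(pad, np.ones(2 * k + 1) / (2 * k + 1), mode="valid")
    resid = np.abs(series - trend)
    return resid / (resid.std() + 1e-8)

def make_windows(series, scores, in_len=24, out_len=1, max_score=2.0):
    """Keep only windows whose input part looks clean enough to trust (threshold is illustrative)."""
    xs, ys = [], []
    for t in range(len(series) - in_len - out_len + 1):
        if scores[t:t + in_len].max() < max_score:
            xs.append(series[t:t + in_len])
            ys.append(series[t + in_len:t + in_len + out_len])
    return (torch.tensor(np.array(xs), dtype=torch.float32),
            torch.tensor(np.array(ys), dtype=torch.float32))

# Toy contaminated series: a sine wave with a few injected point anomalies.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 40, 1000)) + 0.1 * rng.standard_normal(1000)
series[rng.choice(1000, 20, replace=False)] += 5.0

scores = anomaly_scores(series)
x, y = make_windows(series, scores)

model = nn.Linear(24, 1)                      # minimal forecaster
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.L1Loss()                         # MAE is less sensitive to residual anomalies than MSE
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```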
Related papers
- Beyond Data Scarcity: A Frequency-Driven Framework for Zero-Shot Forecasting [15.431513584239047]
Time series forecasting is critical in numerous real-world applications.
Traditional forecasting techniques struggle when data is scarce or not available at all.
Recent advancements often leverage large-scale foundation models for such tasks.
arXiv Detail & Related papers (2024-11-24T07:44:39Z)
- Meta-learning and Data Augmentation for Stress Testing Forecasting Models [0.33554367023486936]
A model is considered to be under stress if it shows a negative behaviour, such as higher-than-usual errors or increased uncertainty.
This paper contributes a novel framework called MAST (Meta-learning and data Augmentation for Stress Testing).
arXiv Detail & Related papers (2024-06-24T17:59:33Z)
- ForecastPFN: Synthetically-Trained Zero-Shot Forecasting [16.12148632541671]
ForecastPFN is the first zero-shot forecasting model trained purely on a novel synthetic data distribution.
We show that zero-shot predictions made by ForecastPFN are more accurate and faster than those of state-of-the-art forecasting methods.
arXiv Detail & Related papers (2023-11-03T14:17:11Z)
- LARA: A Light and Anti-overfitting Retraining Approach for Unsupervised Time Series Anomaly Detection [49.52429991848581]
We propose a Light and Anti-overfitting Retraining Approach (LARA) for deep variational auto-encoder (VAE) based time series anomaly detection methods.
This work makes three novel contributions: 1) the retraining process is formulated as a convex problem, so it converges quickly and resists overfitting; 2) a ruminate block is designed that leverages historical data without needing to store it; and 3) it is proved mathematically that, when fine-tuning the latent vector and reconstructed data, linear formations achieve the least adjusting errors between the ground truths and the fine-tuned ones.
arXiv Detail & Related papers (2023-10-09T12:36:16Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
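The KoVAE entry above only names the key design choice (a linear map for the latent conditional prior dynamics). The sketch below shows one plausible way such a linear-dynamics prior can enter a sequential VAE objective; the GRU encoder, module names, and loss weighting are assumptions, not the authors' implementation.

```python
# Hedged sketch: a sequential VAE whose latent prior follows linear (Koopman-like) dynamics,
# i.e. the prior mean of z_{t+1} is A z_t. Names and the training objective are illustrative.
import torch
import torch.nn as nn

class LinearDynamicsPriorVAE(nn.Module):
    def __init__(self, x_dim: int, z_dim: int, hidden: int = 64):
        super().__init__()
        self.encoder = nn.GRU(x_dim, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, z_dim)
        self.to_logvar = nn.Linear(hidden, z_dim)
        self.decoder = nn.Linear(z_dim, x_dim)
        self.A = nn.Parameter(torch.eye(z_dim))  # linear map governing the prior latent dynamics

    def forward(self, x):                         # x: (batch, time, x_dim)
        h, _ = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterized posterior sample
        recon = self.decoder(z)
        # Prior consistency: z_{t+1} should stay close to A z_t under the linear-dynamics prior.
        prior_mismatch = ((z[:, 1:] - z[:, :-1] @ self.A.T) ** 2).mean()
        recon_loss = ((recon - x) ** 2).mean()
        kl = (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp())).mean()
        return recon_loss + kl + prior_mismatch

model = LinearDynamicsPriorVAE(x_dim=3, z_dim=8)
loss = model(torch.randn(16, 50, 3))   # toy batch: 16 series of length 50 with 3 channels
loss.backward()
```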
- Towards Flexible Time-to-event Modeling: Optimizing Neural Networks via Rank Regression [17.684526928033065]
We introduce the Deep AFT Rank-regression model for Time-to-event prediction (DART).
This model uses an objective function based on Gehan's rank statistic, which is efficient and reliable for representation learning.
The proposed method is a semiparametric approach to AFT modeling that does not impose any distributional assumptions on the survival time distribution.
arXiv Detail & Related papers (2023-07-16T13:58:28Z)
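The DART entry above refers to an objective based on Gehan's rank statistic. As a rough illustration, a standard Gehan-type pairwise loss for accelerated failure time (AFT) models penalizes, for each uncensored sample, the amount by which its residual falls below other samples' residuals. The sketch below implements that standard form; DART's exact objective may differ in detail.

```python
# Hedged sketch of a Gehan-type pairwise rank loss for an AFT-style model; one plausible
# reading of the summary above, not the paper's exact formulation.
import torch

def gehan_rank_loss(pred_log_time, log_time, event):
    """
    pred_log_time: model output f(x), interpreted as predicted log survival time, shape (n,)
    log_time:      observed log time-to-event or censoring time, shape (n,)
    event:         1.0 if the event was observed (uncensored), 0.0 if censored, shape (n,)
    For each uncensored i and any j, penalise max(0, e_j - e_i) where e = log_time - pred.
    """
    e = log_time - pred_log_time                     # AFT residuals
    diff = e.unsqueeze(0) - e.unsqueeze(1)           # diff[i, j] = e_j - e_i
    return (event.unsqueeze(1) * torch.clamp(diff, min=0)).mean()

# Toy usage with a linear predictor on random features.
torch.manual_seed(0)
x = torch.randn(32, 5)
w = torch.zeros(5, requires_grad=True)
loss = gehan_rank_loss(x @ w, torch.rand(32).log1p(), (torch.rand(32) < 0.7).float())
loss.backward()
```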
- Learning Sample Difficulty from Pre-trained Models for Reliable Prediction [55.77136037458667]
We propose to utilize large-scale pre-trained models to guide downstream model training with sample difficulty-aware entropy regularization.
We simultaneously improve accuracy and uncertainty calibration across challenging benchmarks.
arXiv Detail & Related papers (2023-04-20T07:29:23Z)
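The entry above names the mechanism (sample difficulty-aware entropy regularization) without details. The sketch below is one plausible reading: the pre-trained model's per-sample loss serves as a difficulty proxy, and harder samples receive a stronger entropy term; the direction and weighting of the regularizer are assumptions, not the paper's exact formulation.

```python
# Hedged sketch of difficulty-aware entropy regularisation: a pretrained model's per-sample
# loss is the difficulty proxy, and harder samples get a stronger entropy bonus so the
# downstream model stays less over-confident on them.
import torch
import torch.nn as nn
import torch.nn.functional as F

def difficulty_from_pretrained(pretrained, x, y):
    with torch.no_grad():
        per_sample = F.cross_entropy(pretrained(x), y, reduction="none")
    return per_sample / (per_sample.max() + 1e-8)      # difficulty in [0, 1]

def regularised_loss(logits, y, difficulty, lam=0.1):
    ce = F.cross_entropy(logits, y, reduction="none")
    p = logits.softmax(dim=-1)
    entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=-1)
    return (ce - lam * difficulty * entropy).mean()     # reward higher entropy on hard samples

# Toy usage with random "pretrained" and downstream linear classifiers.
torch.manual_seed(0)
x, y = torch.randn(64, 10), torch.randint(0, 3, (64,))
pretrained = nn.Linear(10, 3)
student = nn.Linear(10, 3)
loss = regularised_loss(student(x), y, difficulty_from_pretrained(pretrained, x, y))
loss.backward()
```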
- Leveraging Unlabeled Data to Predict Out-of-Distribution Performance [63.740181251997306]
Real-world machine learning deployments are characterized by mismatches between the source (training) and target (test) distributions.
In this work, we investigate methods for predicting the target domain accuracy using only labeled source data and unlabeled target data.
We propose Average Thresholded Confidence (ATC), a practical method that learns a threshold on the model's confidence and predicts target accuracy as the fraction of unlabeled examples whose confidence exceeds that threshold (see the sketch below).
arXiv Detail & Related papers (2022-01-11T23:01:12Z)
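The ATC entry above is concrete enough to sketch: choose a confidence threshold on labeled source data so that the share of source points above it matches the source accuracy, then report the share of unlabeled target points above that threshold as the predicted target accuracy. The version below uses max-softmax confidence as the score; other scores (e.g., negative entropy) can be plugged in the same way.

```python
# Hedged, illustrative version of Average Thresholded Confidence (ATC).
import numpy as np

def atc_predict_accuracy(source_conf, source_correct, target_conf):
    """
    source_conf:    model confidence (e.g., max softmax prob) on labelled source data
    source_correct: 1 if the source prediction was correct, else 0
    target_conf:    model confidence on unlabelled target data
    """
    source_acc = source_correct.mean()
    # Threshold chosen so the fraction of source points above it equals the source accuracy.
    threshold = np.quantile(source_conf, 1.0 - source_acc)
    return (target_conf > threshold).mean()

# Toy usage with synthetic confidences.
rng = np.random.default_rng(0)
src_conf = rng.uniform(0.5, 1.0, 1000)
src_correct = (rng.uniform(size=1000) < src_conf).astype(float)   # roughly calibrated source model
tgt_conf = rng.uniform(0.4, 0.95, 1000)                           # shifted target distribution
print(atc_predict_accuracy(src_conf, src_correct, tgt_conf))
```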
- Monte Carlo EM for Deep Time Series Anomaly Detection [6.312089019297173]
Time series data are often corrupted by outliers or other kinds of anomalies.
Recent approaches to anomaly detection and forecasting assume that the proportion of anomalies in the training data is small enough to ignore.
We present a technique for augmenting existing time series models so that they explicitly account for anomalies in the training data.
arXiv Detail & Related papers (2021-12-29T07:52:36Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
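The TadGAN entry above specifies the architecture only at a high level: LSTM recurrent networks as generators and critics. The sketch below shows minimal modules of that shape; layer sizes are illustrative, and the training details (losses, additional encoder, anomaly scoring) are omitted.

```python
# Hedged sketch of LSTM-based generator and critic modules, as the TadGAN summary describes.
import torch
import torch.nn as nn

class LSTMGenerator(nn.Module):
    """Maps a latent sequence to a synthetic time-series window."""
    def __init__(self, z_dim=16, hidden=64, x_dim=1):
        super().__init__()
        self.lstm = nn.LSTM(z_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, x_dim)

    def forward(self, z):                  # z: (batch, time, z_dim)
        h, _ = self.lstm(z)
        return self.out(h)                 # (batch, time, x_dim)

class LSTMCritic(nn.Module):
    """Scores how realistic a time-series window looks."""
    def __init__(self, x_dim=1, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(x_dim, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, time, x_dim)
        h, _ = self.lstm(x)
        return self.score(h[:, -1])        # critic score from the last hidden state

gen, critic = LSTMGenerator(), LSTMCritic()
fake = gen(torch.randn(8, 100, 16))        # 8 synthetic windows of length 100
print(critic(fake).shape)                  # torch.Size([8, 1])
```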
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
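The MHP entry above describes extending multiple-hypothesis prediction to sequential data. The sketch below shows the core relaxed winner-takes-all idea behind MHP-style training (most of the gradient goes to the closest hypothesis); the recurrent backbone and the paper's specific extension and metric are not reproduced here.

```python
# Hedged sketch of a relaxed winner-takes-all loss over K prediction heads (MHP-style).
import torch
import torch.nn as nn

class MultiHypothesisHead(nn.Module):
    def __init__(self, in_dim=32, out_dim=4, k=5):
        super().__init__()
        self.heads = nn.Linear(in_dim, out_dim * k)
        self.k, self.out_dim = k, out_dim

    def forward(self, h):                              # h: (batch, in_dim)
        return self.heads(h).view(-1, self.k, self.out_dim)

def mhp_loss(hypotheses, target, eps=0.05):
    """Winner-takes-most: the best head gets most of the gradient, the others a small share."""
    err = ((hypotheses - target.unsqueeze(1)) ** 2).mean(dim=-1)   # (batch, k)
    best = err.argmin(dim=1)
    weights = torch.full_like(err, eps / (err.shape[1] - 1))
    weights.scatter_(1, best.unsqueeze(1), 1.0 - eps)
    return (weights * err).sum(dim=1).mean()

model = MultiHypothesisHead()
loss = mhp_loss(model(torch.randn(16, 32)), torch.randn(16, 4))
loss.backward()
```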
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.