Self-Adaptive Forecasting for Improved Deep Learning on Non-Stationary
Time-Series
- URL: http://arxiv.org/abs/2202.02403v1
- Date: Fri, 4 Feb 2022 21:54:10 GMT
- Title: Self-Adaptive Forecasting for Improved Deep Learning on Non-Stationary
Time-Series
- Authors: Sercan O. Arik, Nathanael C. Yoder and Tomas Pfister
- Abstract summary: SAF integrates a self-adaptation stage prior to forecasting, based on 'backcasting', i.e. predicting masked inputs backward in time.
Our method enables efficient adaptation of encoded representations to evolving distributions, leading to superior generalization.
On synthetic and real-world datasets in domains where time-series data are known to be notoriously non-stationary, such as healthcare and finance, we demonstrate a significant benefit of SAF.
- Score: 20.958959332978726
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Real-world time-series datasets often violate the assumptions of standard
supervised learning for forecasting -- their distributions evolve over time,
rendering the conventional training and model selection procedures suboptimal.
In this paper, we propose a novel method, Self-Adaptive Forecasting (SAF), to
modify the training of time-series forecasting models to improve their
performance on forecasting tasks with such non-stationary time-series data. SAF
integrates a self-adaptation stage prior to forecasting based on `backcasting',
i.e. predicting masked inputs backward in time. This is a form of test-time
training that creates a self-supervised learning problem on test samples before
performing the prediction task. In this way, our method enables efficient
adaptation of encoded representations to evolving distributions, leading to
superior generalization. SAF can be integrated with any canonical
encoder-decoder based time-series architecture such as recurrent neural
networks or attention-based architectures. On synthetic and real-world datasets
in domains where time-series data are known to be notoriously non-stationary,
such as healthcare and finance, we demonstrate a significant benefit of SAF in
improving forecasting accuracy.
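The backcasting idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: the linear encoder, the two heads, and all shapes are hypothetical stand-ins, chosen only to show the test-time loop of adapting a per-sample copy of the encoder on the backcasting (masked-reconstruction) loss before forecasting.

```python
import numpy as np

rng = np.random.default_rng(0)

T, H = 8, 4  # window length and hidden size (hypothetical)
W_enc = rng.normal(scale=0.1, size=(H, T // 2))   # encoder: visible half -> hidden
W_back = rng.normal(scale=0.1, size=(T // 2, H))  # backcast head: hidden -> masked half
W_fore = rng.normal(scale=0.1, size=(1, H))       # forecast head: hidden -> next value

def backcast_loss(W, window):
    """Squared error of reconstructing the masked first half of the window
    from the encoding of the visible second half."""
    first, second = window[: T // 2], window[T // 2 :]
    return 0.5 * float(np.sum((W_back @ (W @ second) - first) ** 2))

def adapt_encoder(window, steps=5, lr=0.1):
    """Self-adaptation stage: gradient-descend a per-sample copy of the
    encoder on the backcasting loss for this test window."""
    first, second = window[: T // 2], window[T // 2 :]
    W = W_enc.copy()
    for _ in range(steps):
        resid = W_back @ (W @ second) - first
        # gradient of 0.5 * ||W_back @ W @ second - first||^2 w.r.t. W
        W -= lr * np.outer(W_back.T @ resid, second)
    return W

def saf_forecast(window):
    """Adapt on the test window first, then forecast with the adapted encoder."""
    W = adapt_encoder(window)
    return float(W_fore @ (W @ window[T // 2 :]))
```

The key point the sketch preserves is that adaptation uses only the test window itself (a self-supervised signal), so the encoder can track a drifting distribution without access to future labels.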
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z) - Future-Guided Learning: A Predictive Approach To Enhance Time-Series Forecasting [4.866362841501992]
We introduce Future-Guided Learning, an approach that enhances time-series event forecasting.
Our approach involves two models: a detection model that analyzes future data to identify critical events and a forecasting model that predicts these events based on present data.
When discrepancies arise between the forecasting and detection models, the forecasting model undergoes more substantial updates.
arXiv Detail & Related papers (2024-10-19T21:22:55Z) - Learning Augmentation Policies from A Model Zoo for Time Series Forecasting [58.66211334969299]
We introduce AutoTSAug, a learnable data augmentation method based on reinforcement learning.
By augmenting the marginal samples with a learnable policy, AutoTSAug substantially improves forecasting performance.
arXiv Detail & Related papers (2024-09-10T07:34:19Z) - Stock Volume Forecasting with Advanced Information by Conditional Variational Auto-Encoder [49.97673761305336]
We demonstrate the use of a Conditional Variational Auto-Encoder (CVAE) to improve the forecasts of daily stock volume time series in both short- and long-term forecasting tasks.
CVAE generates non-linear time series as out-of-sample forecasts, which have better accuracy and closer fit of correlation to the actual data.
arXiv Detail & Related papers (2024-06-19T13:13:06Z) - Probing the Robustness of Time-series Forecasting Models with CounterfacTS [1.823020744088554]
We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks.
CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts.
arXiv Detail & Related papers (2024-03-06T07:34:47Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z) - Prompting-based Temporal Domain Generalization [10.377683220196873]
This paper presents a novel prompting-based approach to temporal domain generalization.
Our method adapts a trained model to temporal drift by learning global prompts, domain-specific prompts, and drift-aware prompts.
Experiments on classification, regression, and time series forecasting tasks demonstrate the generality of the proposed approach.
arXiv Detail & Related papers (2023-10-03T22:40:56Z) - Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
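As a rough illustration of the idea above, and not the paper's exact method (which feeds the self-supervised error into the nonconformity-score estimate), here is a minimal normalized split-conformal sketch on synthetic data, where a stand-in "self-supervised error" that tracks the noise scale rescales the residuals so that intervals widen where the auxiliary signal indicates difficulty:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic heteroscedastic data (all names and shapes are illustrative):
# target noise grows with x, and `ss_err` stands in for the auxiliary
# model's self-supervised error, which happens to track that noise scale.
n = 400
x = rng.uniform(0.0, 1.0, n)
noise_scale = 0.1 + x
y = 2.0 * x + rng.normal(0.0, noise_scale)
pred = 2.0 * x          # stand-in for the base predictive model's forecasts
ss_err = noise_scale    # stand-in for the self-supervised error feature

# Normalized split conformal: calibrate scaled residuals on one split,
# then form test intervals of half-width q * ss_err.
alpha = 0.1
cal, test = slice(0, 200), slice(200, None)
scores = np.abs(y[cal] - pred[cal]) / ss_err[cal]
k = int(np.ceil((1 - alpha) * (scores.size + 1)))
q = np.sort(scores)[k - 1]   # conformal quantile with finite-sample correction
lo = pred[test] - q * ss_err[test]
hi = pred[test] + q * ss_err[test]
coverage = float(np.mean((y[test] >= lo) & (y[test] <= hi)))
```

Because the residuals are normalized before the quantile is taken, the resulting intervals are adaptive (narrow in easy regions, wide in hard ones) while keeping the usual marginal coverage guarantee.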
arXiv Detail & Related papers (2023-02-23T18:57:14Z) - Meta-Forecasting by combining Global Deep Representations with Local Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR).
It adapts to each time series by learning in closed-form the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
arXiv Detail & Related papers (2021-11-05T11:45:02Z)
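The "closed-form mapping from representations to one-step-ahead forecasts" admits a compact sketch. The random tanh projection below is only a hypothetical stand-in for the globally learned RNN; given per-window representations, the per-series local adaptation reduces to ridge regression solved in closed form:

```python
import numpy as np

rng = np.random.default_rng(2)

# One toy series; a frozen random feature map stands in for the shared RNN
# (illustrative only, not the paper's architecture).
series = np.sin(np.arange(60) * 0.3)
lags, d = 5, 8
proj = rng.normal(size=(d, lags))
windows = np.stack([series[i : i + lags] for i in range(len(series) - lags)])
phi = np.tanh(windows @ proj.T)     # per-window "RNN" representations
targets = series[lags:]             # one-step-ahead targets

# Local adaptation in closed form: ridge regression from representations
# to forecasts, solved per series via the normal equations.
lam = 1e-2
w = np.linalg.solve(phi.T @ phi + lam * np.eye(d), phi.T @ targets)
mse = float(np.mean((phi @ w - targets) ** 2))
```

Solving the adaptation in closed form (rather than by gradient steps) is what makes per-series specialization cheap on top of a single globally trained representation.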
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.