Self-Adaptive Forecasting for Improved Deep Learning on Non-Stationary
Time-Series
- URL: http://arxiv.org/abs/2202.02403v1
- Date: Fri, 4 Feb 2022 21:54:10 GMT
- Title: Self-Adaptive Forecasting for Improved Deep Learning on Non-Stationary
Time-Series
- Authors: Sercan O. Arik, Nathanael C. Yoder and Tomas Pfister
- Abstract summary: SAF integrates a self-adaptation stage prior to forecasting based on `backcasting'.
Our method enables efficient adaptation of encoded representations to evolving distributions, leading to superior generalization.
On synthetic and real-world datasets in domains where time-series data are known to be notoriously non-stationary, such as healthcare and finance, we demonstrate a significant benefit of SAF.
- Score: 20.958959332978726
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Real-world time-series datasets often violate the assumptions of standard
supervised learning for forecasting -- their distributions evolve over time,
rendering the conventional training and model selection procedures suboptimal.
In this paper, we propose a novel method, Self-Adaptive Forecasting (SAF), to
modify the training of time-series forecasting models to improve their
performance on forecasting tasks with such non-stationary time-series data. SAF
integrates a self-adaptation stage prior to forecasting based on `backcasting',
i.e. predicting masked inputs backward in time. This is a form of test-time
training that creates a self-supervised learning problem on test samples before
performing the prediction task. In this way, our method enables efficient
adaptation of encoded representations to evolving distributions, leading to
superior generalization. SAF can be integrated with any canonical
encoder-decoder based time-series architecture such as recurrent neural
networks or attention-based architectures. On synthetic and real-world datasets
in domains where time-series data are known to be notoriously non-stationary,
such as healthcare and finance, we demonstrate a significant benefit of SAF in
improving forecasting accuracy.
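To make the backcasting-based self-adaptation stage concrete, the following is a minimal PyTorch sketch of the idea described above: at test time part of the input window is masked, a copy of the model is briefly fine-tuned to backcast the masked points from the visible remainder, and the adapted copy then produces the forecast. The architecture, masking scheme, and hyperparameters here are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of backcasting-based self-adaptation at test time
# (not the authors' reference implementation; details are illustrative).
import copy
import torch
import torch.nn as nn


class BackcastForecastModel(nn.Module):
    """Shared encoder with two heads: one backcasts masked inputs, one forecasts."""

    def __init__(self, mask_len: int, horizon: int, hidden: int = 64):
        super().__init__()
        self.mask_len = mask_len
        self.encoder = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.backcast_head = nn.Linear(hidden, mask_len)   # reconstructs the masked segment
        self.forecast_head = nn.Linear(hidden, horizon)    # predicts the future horizon

    def encode(self, x):
        # x: (batch, time, 1) -> last hidden state (batch, hidden)
        _, h = self.encoder(x)
        return h[-1]

    def backcast(self, x_visible):
        return self.backcast_head(self.encode(x_visible))

    def forecast(self, x_full):
        return self.forecast_head(self.encode(x_full))


def self_adaptive_forecast(model, x, n_steps: int = 3, lr: float = 1e-3):
    """Adapt a copy of the model on the test window via backcasting, then forecast.

    x: (batch, input_len, 1). The earliest `mask_len` points are hidden, the copy is
    briefly fine-tuned to predict them backward in time from the visible remainder,
    and the adapted copy produces the final forecast from the full window.
    """
    adapted = copy.deepcopy(model)  # keep the source model untouched
    adapted.train()
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)

    masked_target = x[:, : model.mask_len, 0]  # points to be backcast
    visible = x[:, model.mask_len :, :]        # what the adaptation step may see

    for _ in range(n_steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(adapted.backcast(visible), masked_target)
        loss.backward()
        opt.step()

    adapted.eval()
    with torch.no_grad():
        return adapted.forecast(x)


# Example: adapt and forecast on a batch of random series.
model = BackcastForecastModel(mask_len=12, horizon=24)
x = torch.randn(8, 48, 1)
y_hat = self_adaptive_forecast(model, x)  # shape (8, 24)
```

Adapting a throwaway copy keeps the trained forecaster intact across test windows; in a real setup one might restrict the update to the encoder and tune the number of adaptation steps on validation data.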
Related papers
- Battling the Non-stationarity in Time Series Forecasting via Test-time Adaptation [39.7344214193566]
We introduce a pioneering test-time adaptation framework tailored for time series forecasting (TSF).
TAFAS, the proposed approach to TSF-TTA, flexibly adapts source forecasters to continuously shifting test distributions while preserving the core semantic information learned during pre-training.
The novel utilization of partially-observed ground truth and gated calibration module enables proactive, robust, and model-agnostic adaptation of source forecasters.
arXiv Detail & Related papers (2025-01-09T04:59:15Z)
- Neural Conformal Control for Time Series Forecasting [54.96087475179419]
We introduce a neural network conformal prediction method for time series that enhances adaptivity in non-stationary environments.
Our approach acts as a neural controller designed to achieve desired target coverage, leveraging auxiliary multi-view data with neural network encoders.
We empirically demonstrate significant improvements in coverage and probabilistic accuracy, and find that our method is the only one that combines good calibration with consistency in prediction intervals.
arXiv Detail & Related papers (2024-12-24T03:56:25Z)
- Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization [74.3339999119713]
We develop a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies.
Our method first scales and decomposes the input time series, then thresholds and quantizes the wavelet coefficients, and finally pre-trains an autoregressive model to forecast coefficients for the forecast horizon (an illustrative tokenization sketch appears after this list).
arXiv Detail & Related papers (2024-12-06T18:22:59Z)
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Future-Guided Learning: A Predictive Approach To Enhance Time-Series Forecasting [4.866362841501992]
We introduce Future-Guided Learning, an approach that enhances time-series event forecasting through a dynamic feedback mechanism inspired by predictive coding.
Our method involves two models: a detection model that analyzes future data to identify critical events and a forecasting model that predicts these events based on current data.
We validate our approach on a variety of tasks, demonstrating a 44.8% increase in AUC-ROC for seizure prediction using EEG data, and a 48.7% reduction in MSE for forecasting in nonlinear dynamical systems.
arXiv Detail & Related papers (2024-10-19T21:22:55Z)
- Stock Volume Forecasting with Advanced Information by Conditional Variational Auto-Encoder [49.97673761305336]
We demonstrate the use of a Conditional Variational Auto-Encoder (CVAE) to improve the forecasts of daily stock volume time series in both short and long term forecasting tasks.
CVAE generates non-linear time series as out-of-sample forecasts, which have better accuracy and closer fit of correlation to the actual data.
arXiv Detail & Related papers (2024-06-19T13:13:06Z)
- Probing the Robustness of Time-series Forecasting Models with CounterfacTS [1.823020744088554]
We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks.
CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts.
arXiv Detail & Related papers (2024-03-06T07:34:47Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
- Meta-Forecasting by combining Global Deep Representations with Local Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR).
It adapts to each time series by learning in closed-form the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
arXiv Detail & Related papers (2021-11-05T11:45:02Z)
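As a rough illustration of the wavelet-based tokenization entry above (Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization), the sketch below scales a series, decomposes it with a discrete wavelet transform via PyWavelets, thresholds small coefficients, and quantizes the rest into integer tokens. The wavelet family, thresholding rule, and vocabulary size are assumptions made here for illustration, not details taken from that paper; the resulting tokens would then be fed to an autoregressive model.

```python
# Illustrative scale -> decompose -> threshold -> quantize tokenizer
# (assumed details; not the paper's exact procedure).
import numpy as np
import pywt  # PyWavelets


def wavelet_tokenize(series, wavelet="db4", level=3, n_bins=256, clip_q=0.99):
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-8)              # scale the input series
    coeffs = pywt.wavedec(x, wavelet, level=level)     # time-localized frequency decomposition
    flat = np.concatenate(coeffs)
    thresh = np.quantile(np.abs(flat), 0.5)
    flat = np.where(np.abs(flat) < thresh, 0.0, flat)  # threshold small coefficients
    clip = np.quantile(np.abs(flat), clip_q) + 1e-8
    flat = np.clip(flat, -clip, clip)
    edges = np.linspace(-clip, clip, n_bins - 1)
    return np.digitize(flat, edges)                    # integer tokens for an autoregressive model


tokens = wavelet_tokenize(np.sin(np.linspace(0, 20, 256)) + 0.1 * np.random.randn(256))
```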
This list is automatically generated from the titles and abstracts of the papers on this site.