Probing the Robustness of Time-series Forecasting Models with
CounterfacTS
- URL: http://arxiv.org/abs/2403.03508v1
- Date: Wed, 6 Mar 2024 07:34:47 GMT
- Title: Probing the Robustness of Time-series Forecasting Models with
CounterfacTS
- Authors: Håkon Hanisch Kjærnli, Lluis Mas-Ribas, Aida Ashrafi, Gleb Sizov, Helge Langseth and Odd Erik Gundersen
- Abstract summary: We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks.
CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts.
- Score: 1.823020744088554
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A common issue for machine learning models applied to time-series forecasting
is the temporal evolution of the data distributions (i.e., concept drift).
Because most of the training data does not reflect such changes, the models
perform poorly on new out-of-distribution scenarios and,
therefore, the impact of such events cannot be reliably anticipated ahead of
time. We present and publicly release CounterfacTS, a tool to probe the
robustness of deep learning models in time-series forecasting tasks via
counterfactuals. CounterfacTS has a user-friendly interface that allows the
user to visualize, compare and quantify time series data and their forecasts,
for a number of datasets and deep learning models. Furthermore, the user can
apply various transformations to the time series and explore the resulting
changes in the forecasts in an interpretable manner. Through example cases, we
illustrate how CounterfacTS can be used to i) identify the main features
characterizing and differentiating sets of time series, ii) assess how the
model performance depends on these characteristics, and iii) guide
transformations of the original time series to create counterfactuals with
desired properties for training, thereby improving forecasting performance in
new regions of the data distribution. We discuss the importance of visualizing
and considering the location of the data in a projected feature space to
transform time-series and create effective counterfactuals for training the
models. Overall, CounterfacTS aids in creating counterfactuals to efficiently
explore the impact of hypothetical scenarios not covered by the original data
in time-series forecasting tasks.
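CounterfacTS itself is an interactive tool; as a minimal, hypothetical sketch of the workflow it supports (none of the names below are the tool's actual API), one can characterize series by simple features, project them into two dimensions, and apply an interpretable transformation to obtain a counterfactual that lands in a new region of the feature space:

```python
# Minimal sketch (not CounterfacTS's actual API) of the core idea:
# characterize series by simple features, project them into 2-D,
# and apply an interpretable transformation to create a counterfactual.
import numpy as np

def features(y: np.ndarray) -> np.ndarray:
    """Hand-picked features: level, scale, and linear-trend slope."""
    t = np.arange(len(y))
    slope = np.polyfit(t, y, 1)[0]  # slope of a fitted linear trend
    return np.array([y.mean(), y.std(), slope])

def project(feature_matrix: np.ndarray) -> np.ndarray:
    """Project feature vectors onto their first two principal components."""
    centered = feature_matrix - feature_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T

def counterfactual(y: np.ndarray, scale: float = 2.0, trend: float = 0.05) -> np.ndarray:
    """Interpretable transformation: amplitude scaling plus an injected trend."""
    return y * scale + trend * np.arange(len(y))

# Toy corpus of noisy sinusoids; one series is transformed into a counterfactual.
rng = np.random.default_rng(0)
series = [np.sin(np.linspace(0, 8 * np.pi, 200)) + rng.normal(0, 0.1, 200)
          for _ in range(20)]
series.append(counterfactual(series[0]))  # out-of-distribution variant

coords = project(np.stack([features(y) for y in series]))
print("original:      ", coords[0])   # lies inside the cluster of originals
print("counterfactual:", coords[-1])  # displaced to a new feature-space region
```

In this toy setup the transformed series is visibly displaced from the cluster of originals in the projected space, mirroring how the tool lets users steer counterfactuals toward regions not covered by the training data.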
Related papers
- TimeInf: Time Series Data Contribution via Influence Functions [8.018453062120916]
TimeInf is a data contribution estimation method for time-series datasets.
Our empirical results demonstrate that TimeInf outperforms state-of-the-art methods in identifying harmful anomalies.
TimeInf offers intuitive and interpretable attributions of data values, allowing us to easily distinguish diverse anomaly patterns through visualizations.
arXiv Detail & Related papers (2024-07-21T19:10:40Z) - Scaling Law for Time Series Forecasting [8.967263259533036]
A scaling law rewarding large datasets, complex models and fine data granularity has been observed in various fields of deep learning.
Yet, studies have cast doubt on whether deep learning methods for time series forecasting follow such scaling behaviors.
We propose a theory of scaling laws for time series forecasting that can explain these seemingly anomalous behaviors; a generic power-law form is sketched after this list.
arXiv Detail & Related papers (2024-05-24T00:46:27Z) - TimeGPT in Load Forecasting: A Large Time Series Model Perspective [38.92798207166188]
Machine learning models have made significant progress in load forecasting, but their forecast accuracy is limited in cases where historical load data is scarce.
This paper aims to discuss the potential of large time series models in load forecasting with scarce historical data.
arXiv Detail & Related papers (2024-04-07T09:05:09Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present Moirai, a Masked Encoder-based Universal Time Series Forecasting Transformer.
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z) - Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Self-Adaptive Forecasting for Improved Deep Learning on Non-Stationary
Time-Series [20.958959332978726]
Self-Adaptive Forecasting (SAF) integrates a self-adaptation stage prior to forecasting, based on 'backcasting'.
Our method enables efficient adaptation of encoded representations to evolving distributions, leading to superior generalization.
On synthetic and real-world datasets in domains where time-series data are known to be notoriously non-stationary, such as healthcare and finance, we demonstrate a significant benefit of SAF.
arXiv Detail & Related papers (2022-02-04T21:54:10Z) - Learning Non-Stationary Time-Series with Dynamic Pattern Extractions [16.19692047595777]
State-of-the-art algorithms have achieved decent performance on stationary temporal data.
Traditional algorithms that tackle stationary time-series do not transfer to non-stationary series such as Forex trading data.
This paper investigates applicable models that can improve the accuracy of forecasting future trends of non-stationary time-series sequences.
arXiv Detail & Related papers (2021-11-20T10:52:37Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
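For context on the scaling-law entry above: in other areas of deep learning, test loss is commonly modeled as a power law in dataset size. The expression below is the generic textbook form, not the specific law proposed in that paper:

```latex
% Generic power-law scaling of test loss L with dataset size D.
% A and \alpha are task-dependent constants; L_\infty is the irreducible loss.
L(D) = A\,D^{-\alpha} + L_{\infty}
```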