On Forecast Stability
- URL: http://arxiv.org/abs/2310.17332v1
- Date: Thu, 26 Oct 2023 11:55:30 GMT
- Title: On Forecast Stability
- Authors: Rakshitha Godahewa, Christoph Bergmeir, Zeynep Erkin Baz, Chengjun
Zhu, Zhangdi Song, Salvador García, Dario Benavides
- Abstract summary: We explore two types of forecast stability that we call vertical stability and horizontal stability.
We propose a simple linear-interpolation-based approach that is applicable to stabilise the forecasts provided by any base model vertically and horizontally.
- Score: 3.5789787644375495
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Forecasts are typically not produced in a vacuum but in a business context,
where forecasts are generated on a regular basis and interact with each other.
For decisions, it may be important that forecasts do not change arbitrarily,
and are stable in some sense. However, this area has received only limited
attention in the forecasting literature. In this paper, we explore two types of
forecast stability that we call vertical stability and horizontal stability.
Existing works in the literature apply only to certain base models, and
extending these frameworks to be compatible with any base model is not
straightforward. Furthermore, these frameworks can only stabilise the forecasts
vertically. To fill this gap, we propose a simple linear-interpolation-based
approach that is applicable to stabilise the forecasts provided by any base
model vertically and horizontally. The approach can produce both accurate and
stable forecasts. Using N-BEATS, Pooled Regression and LightGBM as the base
models, in our evaluation on four publicly available datasets, the proposed
framework is able to achieve significantly higher stability and/or accuracy
compared to a set of benchmarks including a state-of-the-art forecast
stabilisation method across three error metrics and six stability metrics.
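The core idea of blending consecutive-origin forecasts by linear interpolation can be sketched as follows. This is a minimal illustration of the general technique, not the paper's exact framework: the function names, the fixed `weight` parameter, and the alignment convention are assumptions for the sketch.

```python
import numpy as np

def stabilise_forecasts(new_forecast, prev_forecast, weight=0.5):
    """Blend the current-origin forecast with the previous-origin
    forecast for the overlapping target periods via linear interpolation.

    new_forecast  : forecasts made at the current origin
    prev_forecast : forecasts made at the previous origin, aligned so
                    that prev_forecast[i] and new_forecast[i] refer to
                    the same target period
    weight        : interpolation weight in [0, 1]; larger values keep
                    the result closer to the previous-origin forecast,
                    i.e. more vertically stable.
    """
    new_forecast = np.asarray(new_forecast, dtype=float)
    prev_forecast = np.asarray(prev_forecast, dtype=float)
    return weight * prev_forecast + (1.0 - weight) * new_forecast

def vertical_change(curr, prev):
    """Mean absolute change between forecasts for the same target
    periods made at consecutive origins (lower = more stable)."""
    return float(np.mean(np.abs(np.asarray(curr) - np.asarray(prev))))

# Example: forecasts for the same three target periods from two origins.
prev = [100.0, 102.0, 104.0]   # made at origin t
new = [110.0, 103.0, 108.0]    # made at origin t+1
blended = stabilise_forecasts(new, prev, weight=0.5)
# blended = [105.0, 102.5, 106.0]; the revision from prev shrinks by half.
```

The same convex-combination idea applies horizontally, smoothing across adjacent horizons within a single origin instead of across origins; the weight governs the accuracy-versus-stability trade-off in both cases.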
Related papers
- Conditionally valid Probabilistic Conformal Prediction [57.80927226809277]
We develop a new method for creating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
We demonstrate the effectiveness of our approach through extensive simulations, showing that it outperforms existing methods in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Stability Evaluation via Distributional Perturbation Analysis [28.379994938809133]
We propose a stability evaluation criterion based on distributional perturbations.
Our stability evaluation criterion can address both data corruptions and sub-population shifts.
Empirically, we validate the practical utility of our stability evaluation criterion across a host of real-world applications.
arXiv Detail & Related papers (2024-05-06T06:47:14Z)
- Stable Update of Regression Trees [0.0]
We focus on the stability of an inherently explainable machine learning method, namely regression trees.
We propose a regularization method, where data points are weighted based on the uncertainty in the initial model.
Results show that the proposed update method improves stability while achieving similar or better predictive performance.
arXiv Detail & Related papers (2024-02-21T09:41:56Z)
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Robust Linear Regression: Phase-Transitions and Precise Tradeoffs for General Norms [29.936005822346054]
We investigate the impact of test-time adversarial attacks on linear regression models.
We determine the optimal level of robustness that any model can reach while maintaining a given level of standard predictive performance (accuracy).
We obtain a precise characterization which distinguishes between regimes where robustness is achievable without hurting standard accuracy and regimes where a tradeoff might be unavoidable.
arXiv Detail & Related papers (2023-08-01T13:55:45Z)
- Causality-oriented robustness: exploiting general additive interventions [3.871660145364189]
In this paper, we focus on causality-oriented robustness and propose Distributional Robustness via Invariant Gradients (DRIG).
In a linear setting, we prove that DRIG yields predictions that are robust among a data-dependent class of distribution shifts.
We extend our approach to the semi-supervised domain adaptation setting to further improve prediction performance.
arXiv Detail & Related papers (2023-07-18T16:22:50Z)
- Toward Reliable Human Pose Forecasting with Uncertainty [51.628234388046195]
We develop an open-source library for human pose forecasting, including multiple models, supporting several datasets.
We devise two types of uncertainty in the problem to increase performance and convey better trust.
arXiv Detail & Related papers (2023-04-13T17:56:08Z)
- Learning Probabilistic Ordinal Embeddings for Uncertainty-Aware Regression [91.3373131262391]
Uncertainty is the only certainty there is.
Traditionally, the direct regression formulation is considered and the uncertainty is modeled by modifying the output space to a certain family of probabilistic distributions.
How to model the uncertainty within the present-day technologies for regression remains an open issue.
arXiv Detail & Related papers (2021-03-25T06:56:09Z)
- Versatile and Robust Transient Stability Assessment via Instance Transfer Learning [6.760999627905228]
This paper introduces a new data collection method in a data-driven algorithm incorporating the knowledge of power system dynamics.
We introduce a new concept called Fault-Affected Area, which provides crucial information regarding the unstable region of operation.
The test results on the IEEE 39-bus system verify that this model can accurately predict the stability of previously unseen operational scenarios.
arXiv Detail & Related papers (2021-02-20T09:10:29Z)
- Fine-Grained Analysis of Stability and Generalization for Stochastic Gradient Descent [55.85456985750134]
We introduce a new stability measure called on-average model stability, for which we develop novel bounds controlled by the risks of SGD iterates.
This yields generalization bounds depending on the behavior of the best model, and leads to the first-ever-known fast bounds in the low-noise setting.
To the best of our knowledge, this gives the first-ever-known stability and generalization bounds for SGD with even non-differentiable loss functions.
arXiv Detail & Related papers (2020-06-15T06:30:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all listed details) and is not responsible for any consequences of its use.