On Forecast Stability
- URL: http://arxiv.org/abs/2310.17332v1
- Date: Thu, 26 Oct 2023 11:55:30 GMT
- Title: On Forecast Stability
- Authors: Rakshitha Godahewa, Christoph Bergmeir, Zeynep Erkin Baz, Chengjun Zhu, Zhangdi Song, Salvador García, Dario Benavides
- Abstract summary: We explore two types of forecast stability that we call vertical stability and horizontal stability.
We propose a simple linear-interpolation-based approach that can stabilise the forecasts of any base model both vertically and horizontally.
- Score: 3.5789787644375495
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Forecasts are typically not produced in a vacuum but in a business context,
where forecasts are generated on a regular basis and interact with each other.
For decisions, it may be important that forecasts do not change arbitrarily,
and are stable in some sense. However, this area has received only limited
attention in the forecasting literature. In this paper, we explore two types of
forecast stability that we call vertical stability and horizontal stability.
Existing works in the literature are only applicable to certain base models,
and extending these frameworks to be compatible with any base model is not
straightforward. Furthermore, these frameworks can only stabilise forecasts
vertically. To fill this gap, we propose a simple linear-interpolation-based
approach that can stabilise the forecasts of any base model both vertically
and horizontally. The approach can produce both accurate and
stable forecasts. Using N-BEATS, Pooled Regression and LightGBM as the base
models, in our evaluation on four publicly available datasets, the proposed
framework achieves significantly higher stability and/or accuracy than a set
of benchmarks, including a state-of-the-art forecast stabilisation method,
across three error metrics and six stability metrics.
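
The abstract does not spell out the algorithm, but the core idea is easy to illustrate. Below is a minimal Python/NumPy sketch, assuming that vertical stability concerns changes between forecasts for the same target periods made at consecutive forecast origins, and horizontal stability concerns smoothness across the horizons of a single forecast; the function names, the mean-absolute-change metrics, and the blending weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def vertical_instability(prev_forecast, new_forecast):
    """Mean absolute change between forecasts for the same target periods
    made at two consecutive forecast origins (lower = more vertically
    stable). Assumes the overlapping horizons are already aligned."""
    prev = np.asarray(prev_forecast, dtype=float)
    new = np.asarray(new_forecast, dtype=float)
    return np.mean(np.abs(new - prev))

def horizontal_instability(forecast):
    """Mean absolute change between consecutive horizons of a single
    forecast (lower = smoother, i.e. more horizontally stable)."""
    f = np.asarray(forecast, dtype=float)
    return np.mean(np.abs(np.diff(f)))

def stabilise(base_forecast, reference, alpha=0.5):
    """Blend a base model's forecast with a reference series by linear
    interpolation: alpha=1 returns the raw base forecast, alpha=0 the
    reference. For vertical stability the reference would be the forecast
    from the previous origin; for horizontal stability it could be a
    smoothed version of the forecast itself. alpha is a hypothetical
    tuning knob, not a parameter named in the paper."""
    base = np.asarray(base_forecast, dtype=float)
    ref = np.asarray(reference, dtype=float)
    return alpha * base + (1.0 - alpha) * ref

# Toy usage: forecasts for the same 4 target periods from two origins.
old = np.array([100.0, 102.0, 104.0, 106.0])  # previous-origin forecast
new = np.array([110.0, 95.0, 120.0, 90.0])    # new, jumpy base forecast

blended = stabilise(new, old, alpha=0.5)
print(vertical_instability(old, new), vertical_instability(old, blended))
print(horizontal_instability(new), horizontal_instability(blended))
```

In this reading, `alpha` controls the accuracy-stability trade-off: values near 1 keep the raw base forecast, while values near 0 favour the reference and hence stability.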
Related papers
- On the Selection Stability of Stability Selection and Its Applications [2.263635133348731]
This paper seeks to broaden the use of an established stability estimator to evaluate the overall stability of the stability selection framework.
We suggest that the stability estimator offers two advantages: it can serve as a reference to reflect the robustness of the outcomes obtained and help identify an optimal regularization value to improve stability.
arXiv Detail & Related papers (2024-11-14T00:02:54Z) - Calibrated Probabilistic Forecasts for Arbitrary Sequences [58.54729945445505]
Real-world data streams can change unpredictably due to distribution shifts, feedback loops and adversarial actors.
We present a forecasting framework ensuring valid uncertainty estimates regardless of how data evolves.
arXiv Detail & Related papers (2024-09-27T21:46:42Z) - Using dynamic loss weighting to boost improvements in forecast stability [0.9332308328407303]
Rolling origin forecast instability refers to variability in forecasts for a specific period induced by updating the forecast.
In this paper, we empirically investigate whether further improvements in stability can be obtained without compromising accuracy by applying dynamic loss weighting algorithms.
arXiv Detail & Related papers (2024-09-26T20:21:46Z) - Probabilistic Scores of Classifiers, Calibration is not Enough [0.32985979395737786]
In binary classification tasks, accurate representation of probabilistic predictions is essential for various real-world applications.
In this study, we highlight approaches that prioritize the alignment between predicted scores and true probability distributions.
Our findings reveal limitations in traditional calibration metrics, which could undermine the reliability of predictive models for critical decision-making.
arXiv Detail & Related papers (2024-08-06T19:53:00Z) - Stability Evaluation via Distributional Perturbation Analysis [28.379994938809133]
We propose a stability evaluation criterion based on distributional perturbations.
Our stability evaluation criterion can address both data corruptions and sub-population shifts.
Empirically, we validate the practical utility of our stability evaluation criterion across a host of real-world applications.
arXiv Detail & Related papers (2024-05-06T06:47:14Z) - Stable Update of Regression Trees [0.0]
We focus on the stability of an inherently explainable machine learning method, namely regression trees.
We propose a regularization method, where data points are weighted based on the uncertainty in the initial model.
Results show that the proposed update method improves stability while achieving similar or better predictive performance.
arXiv Detail & Related papers (2024-02-21T09:41:56Z) - When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z) - Toward Reliable Human Pose Forecasting with Uncertainty [51.628234388046195]
We develop an open-source library for human pose forecasting, including multiple models, supporting several datasets.
We devise two types of uncertainty in the problem to increase performance and convey better trust.
arXiv Detail & Related papers (2023-04-13T17:56:08Z) - When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecasts distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models forecast distribution of entire hierarchy.
arXiv Detail & Related papers (2022-06-16T06:13:53Z) - Learning Probabilistic Ordinal Embeddings for Uncertainty-Aware
Regression [91.3373131262391]
Uncertainty is the only certainty there is.
Traditionally, the direct regression formulation is considered and the uncertainty is modeled by modifying the output space to a certain family of probabilistic distributions.
How to model the uncertainty within the present-day technologies for regression remains an open issue.
arXiv Detail & Related papers (2021-03-25T06:56:09Z) - Fine-Grained Analysis of Stability and Generalization for Stochastic
Gradient Descent [55.85456985750134]
We introduce a new stability measure called on-average model stability, for which we develop novel bounds controlled by the risks of SGD iterates.
This yields generalization bounds depending on the behavior of the best model, and leads to the first-ever-known fast bounds in the low-noise setting.
To the best of our knowledge, this gives the first-ever-known stability and generalization bounds for SGD with even non-differentiable loss functions.
arXiv Detail & Related papers (2020-06-15T06:30:19Z)