Online detection of forecast model inadequacies using forecast errors
- URL: http://arxiv.org/abs/2502.14173v1
- Date: Thu, 20 Feb 2025 00:56:51 GMT
- Title: Online detection of forecast model inadequacies using forecast errors
- Authors: Thomas Grundy, Rebecca Killick, Ivan Svetunkov
- Abstract summary: We present a novel framework for online monitoring of forecasts to ensure they remain accurate.
By utilizing sequential changepoint techniques on the forecast errors, our framework allows for the real-time identification of potential changes in the process caused by various external factors.
We show theoretically that some common changes in the underlying process will manifest in the forecast errors and can be detected faster through shifts in the forecast errors than within the original modelling framework.
- Abstract: In many organisations, accurate forecasts are essential for making informed decisions for a variety of applications from inventory management to staffing optimization. Whatever forecasting model is used, changes in the underlying process can lead to inaccurate forecasts, which will be damaging to decision-making. At the same time, models are becoming increasingly complex and identifying change through direct modelling is problematic. We present a novel framework for online monitoring of forecasts to ensure they remain accurate. By utilizing sequential changepoint techniques on the forecast errors, our framework allows for the real-time identification of potential changes in the process caused by various external factors. We show theoretically that some common changes in the underlying process will manifest in the forecast errors and can be identified faster by identifying shifts in the forecast errors than within the original modelling framework. Moreover, we demonstrate the effectiveness of this framework on numerous forecasting approaches through simulations and show its effectiveness over alternative approaches. Finally, we present two concrete examples, one from Royal Mail parcel delivery volumes and one from NHS A&E admissions relating to gallstones.
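The abstract describes the core idea, applying a sequential changepoint test to the stream of forecast errors, but no implementation is given here. The sketch below illustrates the general flavour of such monitoring with a standard two-sided CUSUM detector applied to standardized one-step-ahead errors; the function name, parameter values, and simulated data are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

def cusum_monitor(errors, mu0=0.0, sigma=1.0, k=0.5, h=5.0):
    """Two-sided CUSUM on standardized one-step-ahead forecast errors.

    errors : sequence of forecast errors (actual - forecast)
    mu0    : in-control mean of the errors (0 for an unbiased forecast)
    sigma  : in-control standard deviation of the errors
    k      : reference value (allowance), in standard-deviation units
    h      : decision threshold, in standard-deviation units

    Returns the index at which a change is flagged, or None.
    """
    s_pos, s_neg = 0.0, 0.0
    for t, e in enumerate(errors):
        z = (e - mu0) / sigma             # standardize the error
        s_pos = max(0.0, s_pos + z - k)   # accumulate upward drift (forecasts too low)
        s_neg = max(0.0, s_neg - z - k)   # accumulate downward drift (forecasts too high)
        if s_pos > h or s_neg > h:
            return t                      # change flagged at time t
    return None

# Example: errors are in control for 100 steps, then the process mean shifts.
rng = np.random.default_rng(1)
errs = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 50)])
print(cusum_monitor(errs))  # flags a change shortly after t = 100
```

In this toy example the shift in the underlying process shows up directly as a persistent bias in the forecast errors, which the sequential test picks up without any access to, or modification of, the forecasting model itself.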
Related papers
- Deconfounding Time Series Forecasting [1.5967186772129907]
Time series forecasting is a critical task in various domains, where accurate predictions can drive informed decision-making.
Traditional forecasting methods often rely on current observations of variables to predict future outcomes.
We propose an enhanced forecasting approach that incorporates representations of latent confounders derived from historical data.
arXiv Detail & Related papers (2024-10-27T12:45:42Z) - Rating Multi-Modal Time-Series Forecasting Models (MM-TSFM) for Robustness Through a Causal Lens [10.103561529332184]
We focus on multi-modal time-series forecasting, where imprecision due to noisy or incorrect data can lead to erroneous predictions.
We introduce a rating methodology to assess the robustness of Multi-Modal Time-Series Forecasting Models.
arXiv Detail & Related papers (2024-06-12T17:39:16Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z) - Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z) - Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z) - Explain, Adapt and Retrain: How to improve the accuracy of a PPM classifier through different explanation styles [4.6281736192809575]
Recent papers have introduced a novel approach to explain why a Predictive Process Monitoring model for outcome-oriented predictions provides wrong predictions.
We show how to exploit the explanations to identify the most common features that induce a predictor to make mistakes in a semi-automated way.
arXiv Detail & Related papers (2023-03-27T06:37:55Z) - A Closer Look at the Intervention Procedure of Concept Bottleneck Models [18.222350428973343]
Concept bottleneck models (CBMs) are a class of interpretable neural network models that predict the target response of a given input based on its high-level concepts.
CBMs enable domain experts to intervene on the predicted concepts and rectify any mistakes at test time, so that more accurate task predictions can be made at the end.
We develop various ways of selecting intervening concepts to improve the intervention effectiveness and conduct an array of in-depth analyses as to how they evolve under different circumstances.
arXiv Detail & Related papers (2023-02-28T02:37:24Z) - Predicting with Confidence on Unseen Distributions [90.68414180153897]
We connect domain adaptation and predictive uncertainty literature to predict model accuracy on challenging unseen distributions.
We find that the difference of confidences (DoC) of a classifier's predictions successfully estimates the classifier's performance change over a variety of shifts.
We specifically investigate the distinction between synthetic and natural distribution shifts and observe that, despite its simplicity, DoC consistently outperforms other quantifications of distributional difference.
arXiv Detail & Related papers (2021-07-07T15:50:18Z) - Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on their history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Estimating Generalization under Distribution Shifts via Domain-Invariant Representations [75.74928159249225]
We use a set of domain-invariant predictors as a proxy for the unknown, true target labels.
The error of the resulting risk estimate depends on the target risk of the proxy model.
arXiv Detail & Related papers (2020-07-06T17:21:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.