Probabilistic Forecasting with Coherent Aggregation
- URL: http://arxiv.org/abs/2307.09797v1
- Date: Wed, 19 Jul 2023 07:31:37 GMT
- Title: Probabilistic Forecasting with Coherent Aggregation
- Authors: Geoffrey Négiar and Ruijun Ma and O. Nangba Meetei and Mengfei Cao
and Michael W. Mahoney
- Abstract summary: We propose a new model which leverages a factor model structure to produce coherent forecasts by construction.
Our model achieves significant improvements of between $11.8\%$ and $41.4\%$ on three hierarchical forecasting datasets.
- Score: 41.861302072687835
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Obtaining accurate probabilistic forecasts while respecting hierarchical
information is an important operational challenge in many applications, perhaps
most obviously in energy management, supply chain planning, and resource
allocation. The basic challenge, especially for multivariate forecasting, is
that forecasts are often required to be coherent with respect to the
hierarchical structure. In this paper, we propose a new model which leverages a
factor model structure to produce coherent forecasts by construction. This is a
consequence of a simple (exchangeability) observation: permuting
base-level series in the hierarchy does not change their aggregates.
Our model uses a convolutional neural network to produce parameters for the
factors, their loadings and base-level distributions; it produces samples which
can be differentiated with respect to the model's parameters; and it can
therefore optimize for any sample-based loss function, including the Continuous
Ranked Probability Score and quantile losses. We can choose arbitrary
continuous distributions for the factor and the base-level distributions. We
compare our method to two previous methods which can be optimized end-to-end,
while enforcing coherent aggregation. Our model achieves significant
improvements of between $11.8\%$ and $41.4\%$ on three hierarchical forecasting datasets.
We also analyze the sensitivity of our model to the choice of base-level
distribution and the number of factors.
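The coherence-by-construction mechanism described in the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's model: the convolutional network that produces the factor, loading, and base-level parameters is replaced by fixed Gaussian factors and hand-picked loadings, and the names `sample_coherent` and `crps_from_samples` are hypothetical. The key property, that aggregate-level samples are exact sums of base-level samples, carries over.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_coherent(loadings, S, n_samples, rng, noise_scale=0.1):
    """Draw coherent hierarchical samples from a simple factor model.

    loadings : (n_base, n_factors) factor loadings of the base-level series
    S        : (n_total, n_base) summation matrix mapping base series to
               every node in the hierarchy (aggregates are sums of bases)
    Returns an array of shape (n_samples, n_total). Because every level is
    computed by summing base-level samples through S, each sample is
    coherent by construction.
    """
    n_base, n_factors = loadings.shape
    z = rng.normal(size=(n_samples, n_factors))               # shared factors
    eps = noise_scale * rng.normal(size=(n_samples, n_base))  # idiosyncratic noise
    base = z @ loadings.T + eps                               # base-level samples
    return base @ S.T                                         # all hierarchy levels

def crps_from_samples(samples, y):
    """Sample-based CRPS estimate for a scalar target y:
    E|X - y| - 0.5 * E|X - X'| over forecast samples X, X'."""
    pairwise = np.abs(samples[:, None] - samples[None, :])
    return float(np.mean(np.abs(samples - y)) - 0.5 * pairwise.mean())

# Toy 3-node hierarchy: total = A + B.
S = np.array([[1.0, 1.0],   # total
              [1.0, 0.0],   # A
              [0.0, 1.0]])  # B
loadings = np.array([[0.8],
                     [0.5]])  # one shared factor
samples = sample_coherent(loadings, S, 1000, rng)

# Coherence holds exactly in every sample, not just in expectation.
assert np.allclose(samples[:, 0], samples[:, 1] + samples[:, 2])
loss = crps_from_samples(samples[:, 0], 0.3)  # score the total-level forecast
```

Because the samples are simple transformations of the noise, the same construction stays differentiable with respect to the loadings in an autodiff framework, which is what allows training on sample-based losses such as CRPS and quantile losses.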
Related papers
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
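The soft-consistency idea can be illustrated with a minimal penalty term added to the forecasting loss. This is a generic stand-in, not PROFHiT's actual regularizer; the function name `soft_consistency_penalty` and the toy hierarchy are assumptions for illustration.

```python
import numpy as np

def soft_consistency_penalty(means, S):
    """Squared gap between each aggregate forecast mean and the sum of its
    base-level children; added to the training loss with some weight, it
    nudges forecasts toward coherence without enforcing it rigidly.

    means : (n_total,) forecast means, ordered as [aggregates..., bases...]
    S     : (n_agg, n_base) rows sum the base series for each aggregate
    """
    n_agg = S.shape[0]
    agg, base = means[:n_agg], means[n_agg:]
    gap = agg - S @ base
    return float(np.mean(gap ** 2))

# total = A + B hierarchy with a slightly incoherent forecast.
means = np.array([10.5, 6.0, 4.0])   # [total, A, B]
S = np.array([[1.0, 1.0]])
penalty = soft_consistency_penalty(means, S)  # (10.5 - 10.0)^2 = 0.25
```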
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting nonstationary processes or processes with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate this tessellation and approximate the multiple hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- A Top-Down Approach to Hierarchically Coherent Probabilistic Forecasting [21.023456590248827]
We use a novel attention-based RNN model to learn the distribution of the proportions according to which each parent prediction is split among its children nodes at any point in time.
The resulting forecasts are computed in a top-down fashion and are naturally coherent.
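The top-down splitting step can be sketched as follows. The paper learns the proportion distribution with an attention-based RNN; here a fixed Dirichlet distribution stands in for that learned component, and `top_down_samples` is a hypothetical name.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_down_samples(parent_samples, alpha, rng):
    """Split parent forecast samples among children via sampled proportions.

    parent_samples : (n_samples,) samples of the parent series
    alpha          : (n_children,) Dirichlet concentration parameters
    Returns (n_samples, n_children) child samples. Proportions sum to one,
    so the children sum exactly to the parent in every sample and the
    forecasts are coherent by construction.
    """
    props = rng.dirichlet(alpha, size=parent_samples.shape[0])
    return parent_samples[:, None] * props

parent = rng.normal(loc=100.0, scale=5.0, size=500)
children = top_down_samples(parent, np.array([2.0, 1.0, 1.0]), rng)
assert np.allclose(children.sum(axis=1), parent)
```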
arXiv Detail & Related papers (2022-04-21T21:32:28Z)
- Distributional Gradient Boosting Machines [77.34726150561087]
Our framework is based on XGBoost and LightGBM.
We show that our framework achieves state-of-the-art forecast accuracy.
arXiv Detail & Related papers (2022-04-02T06:32:19Z)
- MECATS: Mixture-of-Experts for Quantile Forecasts of Aggregated Time Series [11.826510794042548]
We introduce a mixture of heterogeneous experts framework called MECATS.
It simultaneously forecasts the values of a set of time series that are related through an aggregation hierarchy.
Different types of forecasting models can be employed as individual experts so that the form of each model can be tailored to the nature of the corresponding time series.
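A minimal sketch of the combination step, assuming per-expert quantile forecasts are blended by a gating distribution (here replaced by fixed weights); the function name, weights, and values are illustrative assumptions, not MECATS's actual architecture.

```python
import numpy as np

def mixture_quantile_forecast(expert_quantiles, gate_weights):
    """Combine per-expert quantile forecasts with gating weights.

    expert_quantiles : (n_experts, n_quantiles) each expert's forecast at a
                       common grid of quantile levels
    gate_weights     : (n_experts,) nonnegative weights summing to 1; in a
                       mixture-of-experts model these come from a learned
                       gating network conditioned on the input series
    """
    gate_weights = np.asarray(gate_weights)
    assert np.isclose(gate_weights.sum(), 1.0)
    return gate_weights @ np.asarray(expert_quantiles)

q = mixture_quantile_forecast([[9.0, 10.0, 11.0],    # e.g. a statistical expert
                               [8.0, 10.0, 12.0]],   # e.g. a neural expert
                              [0.75, 0.25])
# weighted combination: [8.75, 10.0, 11.25]
```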
arXiv Detail & Related papers (2021-12-22T05:05:30Z)
- Probabilistic Hierarchical Forecasting with Deep Poisson Mixtures [2.1670528702668648]
We present a novel method capable of accurate and coherent probabilistic forecasts for time series when reliable hierarchical information is present.
We call it the Deep Poisson Mixture Network (DPMN).
It relies on the combination of neural networks and a statistical model for the joint distribution of the hierarchical time series structure.
arXiv Detail & Related papers (2021-10-25T18:02:03Z)
- Explaining a Series of Models by Propagating Local Feature Attributions [9.66840768820136]
Pipelines involving several machine learning models improve performance in many domains but are difficult to understand.
We introduce a framework to propagate local feature attributions through complex pipelines of models based on a connection to the Shapley value.
Our framework enables us to draw higher-level conclusions based on groups of gene expression features for Alzheimer's and breast cancer histologic grade prediction.
arXiv Detail & Related papers (2021-04-30T22:20:58Z)
- On the Discrepancy between Density Estimation and Sequence Generation [92.70116082182076]
Log-likelihood is highly correlated with BLEU when we consider models within the same family.
We observe no correlation between rankings of models across different families.
arXiv Detail & Related papers (2020-02-17T20:13:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.