Aggregating distribution forecasts from deep ensembles
- URL: http://arxiv.org/abs/2204.02291v1
- Date: Tue, 5 Apr 2022 15:42:51 GMT
- Title: Aggregating distribution forecasts from deep ensembles
- Authors: Benedikt Schulz and Sebastian Lerch
- Abstract summary: We study the question of how to aggregate distribution forecasts based on neural network-based approaches.
We show that combining forecast distributions can substantially improve the predictive performance.
We propose a general quantile aggregation framework for deep ensembles that shows superior performance compared to a linear combination of the forecast densities.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The importance of accurately quantifying forecast uncertainty has motivated
much recent research on probabilistic forecasting. In particular, a variety of
deep learning approaches has been proposed, with forecast distributions
obtained as output of neural networks. These neural network-based methods are
often used in the form of an ensemble based on multiple model runs from
different random initializations, resulting in a collection of forecast
distributions that need to be aggregated into a final probabilistic prediction.
With the aim of consolidating findings from the machine learning literature on
ensemble methods and the statistical literature on forecast combination, we
address the question of how to aggregate distribution forecasts based on such
deep ensembles. Using theoretical arguments, simulation experiments and a case
study on wind gust forecasting, we systematically compare probability- and
quantile-based aggregation methods for three neural network-based approaches
with different forecast distribution types as output. Our results show that
combining forecast distributions can substantially improve the predictive
performance. We propose a general quantile aggregation framework for deep
ensembles that shows superior performance compared to a linear combination of
the forecast densities. Finally, we investigate the effects of the ensemble
size and derive recommendations for aggregating distribution forecasts from
deep ensembles in practice.
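As a rough illustration of the two aggregation strategies the abstract contrasts (not the authors' implementation; all function names and toy numbers below are made up for this sketch), a linear pool averages the member CDFs at each threshold, while quantile aggregation (Vincentization) averages member quantiles level by level:

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    """CDF of a normal distribution (avoids a SciPy dependency)."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def norm_quantile(p, mu, sigma):
    """Inverse normal CDF by bisection over +/- 10 standard deviations."""
    lo, hi = mu - 10.0 * sigma, mu + 10.0 * sigma
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid, mu, sigma) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def linear_pool_cdf(x, members):
    """Linear pool: average the member CDFs at threshold x."""
    return sum(norm_cdf(x, mu, s) for mu, s in members) / len(members)

def vincentize(ps, members):
    """Quantile aggregation (Vincentization): average the member
    quantiles at each probability level p in ps."""
    return [sum(norm_quantile(p, mu, s) for mu, s in members) / len(members)
            for p in ps]

members = [(0.0, 1.0), (2.0, 1.0)]   # two toy Gaussian member forecasts
median = vincentize([0.5], members)[0]  # averages the member medians
```

For Gaussian members with equal variance, Vincentization yields a Gaussian with the averaged mean and unchanged spread, whereas the linear pool yields a wider, possibly multimodal mixture; this difference in sharpness is one way to see why the two schemes can perform differently.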
Related papers
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Postprocessing of Ensemble Weather Forecasts Using Permutation-invariant Neural Networks [0.0]
We propose networks that treat forecast ensembles as a set of unordered member forecasts.
We evaluate the quality of the obtained forecast distributions in terms of calibration and sharpness.
Our results suggest that most of the relevant information is contained in a few ensemble-internal degrees of freedom.
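Treating an ensemble as an unordered set, as the paper above proposes, can be sketched with a Deep-Sets-style encoder (a hypothetical toy, not the paper's architecture; the weights and dimensions are arbitrary): each member is embedded, embeddings are mean-pooled, and a readout maps the pooled vector to distribution parameters, so the output cannot depend on member order.

```python
import numpy as np

# Hypothetical weights for illustration only.
rng = np.random.default_rng(0)
W_phi = rng.normal(size=(1, 8))   # per-member embedding weights
W_rho = rng.normal(size=(8, 2))   # readout to 2 distribution parameters

def set_encode(members):
    """Deep-Sets-style encoder: embed each member forecast, mean-pool,
    then read out. The result is invariant to the ordering of members."""
    h = np.tanh(members[:, None] @ W_phi)  # shape (m, 8), one row per member
    pooled = h.mean(axis=0)                # order-invariant pooling
    return pooled @ W_rho                  # e.g. (mu, log_sigma)

ens = np.array([1.2, 0.7, 1.9])  # a toy raw ensemble
params = set_encode(ens)
```

Because the only interaction between members is the mean over the embedding axis, permuting the ensemble leaves the output unchanged by construction.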
arXiv Detail & Related papers (2023-09-08T17:20:51Z)
- Distributionally Robust Machine Learning with Multi-source Data [6.383451076043423]
We introduce a group distributionally robust prediction model that optimizes an adversarial reward based on explained variance with respect to a class of target distributions.
Compared to classical empirical risk minimization, the proposed robust prediction model improves the prediction accuracy for target populations with distribution shifts.
We demonstrate the performance of our proposed group distributionally robust method on simulated and real data with random forests and neural networks as base-learning algorithms.
arXiv Detail & Related papers (2023-09-05T13:19:40Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important for forecasting nonstationary processes or those with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple-hypothesis predictors for regression problems.
It is proved that this structured model can efficiently interpolate the underlying tessellation and approximate the multiple-hypothesis target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- Forecast combinations: an over 50-year review [16.590353808305245]
Forecast combinations have flourished remarkably in the forecasting community.
This paper provides an up-to-date review of the literature on forecast combinations.
We discuss the potential and limitations of various methods and highlight how these ideas have developed over time.
arXiv Detail & Related papers (2022-05-09T12:14:02Z)
- A Top-Down Approach to Hierarchically Coherent Probabilistic Forecasting [21.023456590248827]
We use a novel attention-based RNN model to learn the distribution of the proportions according to which each parent prediction is split among its children nodes at any point in time.
The resulting forecasts are computed in a top-down fashion and are naturally coherent.
arXiv Detail & Related papers (2022-04-21T21:32:28Z)
- Distributional Gradient Boosting Machines [77.34726150561087]
Our framework is based on XGBoost and LightGBM.
We show that our framework achieves state-of-the-art forecast accuracy.
arXiv Detail & Related papers (2022-04-02T06:32:19Z)
- Learning Structured Gaussians to Approximate Deep Ensembles [10.055143995729415]
This paper proposes using a sparse-structured multivariate Gaussian to provide a closed-form approximation for dense image prediction tasks.
We capture the uncertainty and structured correlations in the predictions explicitly in a formal distribution, rather than implicitly through sampling alone.
We demonstrate the merits of our approach on monocular depth estimation and show that the advantages of our approach are obtained with comparable quantitative performance.
arXiv Detail & Related papers (2022-03-29T12:34:43Z) - Probabilistic electric load forecasting through Bayesian Mixture Density
Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated.
arXiv Detail & Related papers (2020-12-23T16:21:34Z) - Video Prediction via Example Guidance [156.08546987158616]
In video prediction tasks, one major challenge is to capture the multi-modal nature of future contents and dynamics.
In this work, we propose a simple yet effective framework that can efficiently predict plausible future states.
arXiv Detail & Related papers (2020-07-03T14:57:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.