CRPS Learning
- URL: http://arxiv.org/abs/2102.00968v1
- Date: Mon, 1 Feb 2021 16:54:05 GMT
- Title: CRPS Learning
- Authors: Jonathan Berrisch, Florian Ziel
- Abstract summary: Combination and aggregation techniques can improve forecast accuracy substantially.
We discuss pointwise online aggregation algorithms that optimize with respect to the continuous ranked probability score (CRPS).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Combination and aggregation techniques can improve forecast accuracy
substantially. This also holds for probabilistic forecasting methods where full
predictive distributions are combined. There are several time-varying and
adaptive weighting schemes like Bayesian model averaging (BMA). However, the
performance of different forecasters may vary not only over time but also in
parts of the distribution. For example, one forecaster may be more accurate in
the center of the distribution, while another performs better in predicting the
distribution's tails. Consequently, we introduce a new weighting procedure that
accounts for varying performance both across time and across the distribution.
We discuss pointwise
online aggregation algorithms that optimize with respect to the continuous
ranked probability score (CRPS). After analyzing the theoretical properties of
a fully adaptive Bernstein online aggregation (BOA) method, we introduce
smoothing procedures for pointwise CRPS learning. The properties are confirmed
and discussed using simulation studies. Additionally, we illustrate the
performance in a forecasting study for carbon markets. In detail, we predict
the distribution of European emission allowance prices.
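Since the CRPS is the integral of the pinball (quantile) loss over all quantile levels, pointwise CRPS learning can be read as running one online aggregation problem per quantile level. The following sketch illustrates that idea with a plain exponential-weights update; all function names and array shapes are illustrative, and the paper's fully adaptive BOA replaces this simple update with a refined second-order one.

```python
import numpy as np

def pinball(q, y, tau):
    """Pinball (quantile) loss at level tau; its integral over
    tau in (0, 1) equals the CRPS."""
    u = y - q
    return np.maximum(tau * u, (tau - 1) * u)

def pointwise_crps_learning(forecasts, y, taus, eta=1.0):
    """Illustrative pointwise online aggregation: one weight vector per
    quantile level, updated after each observation. The simple
    exponential-weights step below is a stand-in for the paper's fully
    adaptive Bernstein online aggregation (BOA).

    forecasts: (T, K, M) expert quantile forecasts (time, expert, level)
    y:         (T,)      observations
    taus:      (M,)      quantile levels
    """
    T, K, M = forecasts.shape
    w = np.full((K, M), 1.0 / K)                     # one simplex per level
    combined = np.empty((T, M))
    for t in range(T):
        combined[t] = (w * forecasts[t]).sum(axis=0)  # pool per level
        loss = pinball(forecasts[t], y[t], taus)      # (K, M) pointwise losses
        w *= np.exp(-eta * loss)                      # downweight poor experts
        w /= w.sum(axis=0, keepdims=True)
        # the paper additionally smooths the rows of w across neighbouring
        # quantile levels (the smoothing procedures for pointwise CRPS learning)
    return combined, w
```

Because each per-level weight vector stays on the simplex, the combined quantile curve is a convex combination of the experts' curves and remains non-decreasing whenever theirs are; the smoothing step indicated in the comment is what ties neighbouring levels together.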
Related papers
- Efficient pooling of predictions via kernel embeddings [0.24578723416255752]
Probabilistic predictions are probability distributions over the set of possible outcomes.
They are typically combined by linearly pooling the individual predictive distributions.
Weights assigned to each prediction can be estimated based on their past performance.
This can be achieved by finding the weights that optimise a proper scoring rule over some training data; a minimal example is sketched below.
arXiv Detail & Related papers (2024-11-25T10:04:37Z)
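A minimal sketch of the weight-estimation idea above, assuming expert CDFs evaluated on a fixed grid and using the CRPS as the proper scoring rule (the paper itself works with kernel-embedding scores); all names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def crps_on_grid(F, y, z):
    """CRPS of a predictive CDF F on grid z for outcome y, via numerical
    integration of (F(z) - 1{y <= z})^2."""
    ind = (z >= y).astype(float)
    return np.trapz((F - ind) ** 2, z)

def fit_pool_weights(cdfs, ys, z):
    """Choose linear-pool weights minimising the average CRPS on
    training data -- one instance of optimising a proper scoring rule.

    cdfs: (T, K, G) expert CDFs on grid z for each time step
    ys:   (T,)      realised outcomes
    z:    (G,)      evaluation grid
    """
    T, K, G = cdfs.shape

    def objective(theta):
        w = np.exp(theta) / np.exp(theta).sum()           # simplex via softmax
        pooled = np.tensordot(cdfs, w, axes=([1], [0]))   # linear pool, (T, G)
        return np.mean([crps_on_grid(pooled[t], ys[t], z) for t in range(T)])

    res = minimize(objective, np.zeros(K), method="Nelder-Mead")
    return np.exp(res.x) / np.exp(res.x).sum()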
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy; the consistency idea is sketched below.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
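The "soft consistency" idea can be illustrated for Gaussian forecasts: rather than forcing each parent series to equal the sum of its children (hard reconciliation), one penalizes the divergence between the parent's forecast distribution and the distribution implied by summing the children. This is a hypothetical sketch of the general idea, not PROFHiT's exact loss.

```python
import numpy as np

def soft_consistency_penalty(mu, sigma, children, lam=1.0):
    """Soft consistency regularizer for hierarchical Gaussian forecasts.

    mu, sigma: dicts node -> forecast mean / standard deviation
    children:  dict parent node -> list of child nodes
    lam:       regularization strength
    """
    penalty = 0.0
    for parent, kids in children.items():
        mu_sum = sum(mu[k] for k in kids)             # mean of summed children
        var_sum = sum(sigma[k] ** 2 for k in kids)    # variance, if independent
        var_p = sigma[parent] ** 2
        # KL( N(mu_sum, var_sum) || N(mu_parent, var_p) )
        kl = 0.5 * (np.log(var_p / var_sum)
                    + (var_sum + (mu_sum - mu[parent]) ** 2) / var_p
                    - 1.0)
        penalty += kl
    return lam * penalty
```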
- Boosted Control Functions [10.503777692702952]
This work aims to bridge the gap between causal effect estimation and prediction tasks.
We establish a novel connection between the field of distribution generalization from machine learning and simultaneous equation models and control functions from econometrics.
Within this framework, we propose a strong notion of invariance for a predictive model and compare it with existing (weaker) versions; the classical control-function construction is sketched below.
arXiv Detail & Related papers (2023-10-09T15:43:46Z)
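For readers unfamiliar with the econometric ingredient named above, here is a sketch of the classical (non-boosted) control function: a first-stage residual is included as an extra regressor so that the second stage recovers the causal coefficient despite confounding. The simulated data and coefficients are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
Z = rng.normal(size=n)                  # instrument
U = rng.normal(size=n)                  # unobserved confounder
X = 0.8 * Z + U + rng.normal(size=n)    # endogenous regressor
Y = 1.5 * X + 2.0 * U + rng.normal(size=n)

# Stage 1: residual of X given Z -- the control function
beta1 = np.linalg.lstsq(np.c_[np.ones(n), Z], X, rcond=None)[0]
V = X - np.c_[np.ones(n), Z] @ beta1

# Stage 2: regress Y on X and the control V; the control absorbs
# the confounding that runs through U
beta2 = np.linalg.lstsq(np.c_[np.ones(n), X, V], Y, rcond=None)[0]
print(beta2[1])   # close to the causal coefficient 1.5
```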
- Ensemble weather forecast post-processing with a flexible probabilistic neural network approach [0.0]
We propose a novel, neural network-based method, which produces forecasts for all locations and lead times jointly.
We demonstrate the effectiveness of our method in the context of the EUPPBench benchmark.
arXiv Detail & Related papers (2023-03-29T15:18:00Z)
- Scalable Dynamic Mixture Model with Full Covariance for Probabilistic Traffic Forecasting [14.951166842027819]
We propose a dynamic mixture of zero-mean Gaussian distributions for the time-varying error process.
The proposed method can be seamlessly integrated into existing deep-learning frameworks with only a few additional parameters to be learned.
We evaluate the proposed method on a traffic speed forecasting task and find that it not only improves model performance but also provides interpretable temporal correlation structures; the mixture error model is sketched below.
arXiv Detail & Related papers (2022-12-10T22:50:00Z)
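A minimal sketch of the error model described above, reduced to scalar errors for clarity (the paper models full covariance across forecast horizons); all names are illustrative.

```python
import numpy as np

def mixture_nll(resid, weights, sigmas):
    """Negative log-likelihood of residuals under a dynamic mixture of
    zero-mean Gaussians: the mixture weights may vary over time (one
    simplex row per step) while the component scales are shared, so
    only a few extra parameters need to be learned.

    resid:   (T,)   forecast errors
    weights: (T, C) time-varying mixture weights, rows sum to 1
    sigmas:  (C,)   component standard deviations
    """
    # density of each component at each residual, shape (T, C)
    dens = np.exp(-0.5 * (resid[:, None] / sigmas) ** 2) / (
        sigmas * np.sqrt(2 * np.pi))
    return -np.log((weights * dens).sum(axis=1)).sum()
```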
- Distributionally Robust Models with Parametric Likelihood Ratios [123.05074253513935]
Three simple ideas allow us to train models with DRO using a broader class of parametric likelihood ratios.
We find that models trained with the resulting parametric adversaries are consistently more robust to subpopulation shifts when compared to other DRO approaches; the reweighting idea is sketched below.
arXiv Detail & Related papers (2022-04-13T12:43:12Z)
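As a rough illustration of the parametric likelihood-ratio idea (not the paper's exact formulation), an adversary can score each training example, and the scores can be turned into non-negative, mean-one weights on the per-example losses.

```python
import numpy as np

def dro_objective(losses, scores):
    """DRO objective with a parametric likelihood ratio: 'scores' come
    from an adversary model; exponentiation plus mean-one normalization
    turns them into a valid likelihood ratio. The model minimizes this
    objective while the adversary (subject to a divergence constraint,
    omitted here) maximizes it.

    losses: (N,) per-example training losses
    scores: (N,) adversary outputs, one per example
    """
    w = np.exp(scores - scores.max())   # non-negative, numerically stable
    w = w / w.mean()                    # likelihood ratio: w >= 0, mean(w) = 1
    return float(np.mean(w * losses))
```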
- Aggregating distribution forecasts from deep ensembles [0.0]
We propose a general quantile aggregation framework for deep ensembles.
We show that combining forecast distributions from deep ensembles can substantially improve predictive performance; quantile aggregation is sketched below.
arXiv Detail & Related papers (2022-04-05T15:42:51Z)
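A minimal sketch of quantile aggregation (often called Vincentization), assuming each ensemble member provides predictive samples; this only illustrates the core idea of averaging quantile functions level by level, not necessarily the paper's full framework.

```python
import numpy as np

def vincentize(samples, taus):
    """Average the member quantile functions level by level, instead of
    averaging CDFs as a linear pool would.

    samples: (K, N) predictive samples from K ensemble members
    taus:    (M,)   quantile levels of the combined forecast
    """
    member_quantiles = np.quantile(samples, taus, axis=1)  # (M, K)
    return member_quantiles.mean(axis=1)                   # (M,) combined
```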
- Distributional Gradient Boosting Machines [77.34726150561087]
Our framework is based on XGBoost and LightGBM.
We show that our framework achieves state-of-the-art forecast accuracy.
arXiv Detail & Related papers (2022-04-02T06:32:19Z)
- Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-03T08:32:13Z)
- Probabilistic electric load forecasting through Bayesian Mixture Density Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated; the mixture-density likelihood is sketched below.
arXiv Detail & Related papers (2020-12-23T16:21:34Z)
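To make the mixture-density ingredient concrete, here is a sketch of the Gaussian-mixture negative log-likelihood such a network head would minimize; the Bayesian treatment via variational inference and ensembling is beyond this snippet, and all shapes are illustrative.

```python
import numpy as np

def mdn_nll(y, pi, mu, sigma):
    """Mixture density network loss: the network emits mixture weights
    pi, means mu, and scales sigma for each input, and training
    minimizes the Gaussian-mixture negative log-likelihood.

    y:         (N,)   targets (e.g., electric load)
    pi:        (N, C) mixture weights per sample, rows sum to 1
    mu, sigma: (N, C) component means and standard deviations
    """
    z = (y[:, None] - mu) / sigma
    dens = np.exp(-0.5 * z ** 2) / (sigma * np.sqrt(2 * np.pi))
    return -np.log((pi * dens).sum(axis=1)).sum()
```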
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.