Probabilistic electric load forecasting through Bayesian Mixture Density
Networks
- URL: http://arxiv.org/abs/2012.14389v2
- Date: Mon, 11 Jan 2021 10:19:04 GMT
- Title: Probabilistic electric load forecasting through Bayesian Mixture Density
Networks
- Authors: Alessandro Brusaferri and Matteo Matteucci and Stefano Spinelli and
Andrea Vitali
- Abstract summary: Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated.
- Score: 70.50488907591463
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic load forecasting (PLF) is a key component in the extended
tool-chain required for efficient management of smart energy grids. Neural
networks are widely adopted for their improved prediction performance,
supporting highly flexible mappings of complex relationships between the target
and the conditioning variables set. However, obtaining comprehensive predictive
uncertainties from such black-box models is still a challenging and unsolved
problem. In this work, we propose a novel PLF approach, framed on Bayesian
Mixture Density Networks. Both aleatoric and epistemic uncertainty sources are
encompassed within the model predictions, inferring general conditional
densities, depending on the input features, within an end-to-end training
framework. To achieve reliable and computationally scalable estimators of the
posterior distributions, both Mean Field variational inference and deep
ensembles are integrated. Experiments have been performed on household
short-term load forecasting tasks, showing the capability of the proposed
method to achieve robust performance under different operating conditions.
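The paper's implementation is not reproduced here; as a rough, numpy-only sketch of the core idea (all function and variable names are hypothetical), a mixture density head maps features to mixture parameters, capturing aleatoric uncertainty, while averaging the predictive densities of several independently initialized heads stands in for the epistemic part that the paper estimates with variational inference and deep ensembles:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mdn_params(h, W_pi, W_mu, W_s):
    """Map a feature vector h to mixture weights, means, and std devs."""
    pi = softmax(h @ W_pi)      # mixture weights, sum to 1
    mu = h @ W_mu               # component means
    sigma = np.exp(h @ W_s)     # positive std devs via exp
    return pi, mu, sigma

def mixture_pdf(y, pi, mu, sigma):
    """Gaussian mixture density at y (aleatoric uncertainty)."""
    comp = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float((pi * comp).sum())

def ensemble_pdf(y, h, heads):
    """Deep-ensemble predictive density: average the member mixtures
    (a crude stand-in for integrating over the weight posterior)."""
    return float(np.mean([mixture_pdf(y, *mdn_params(h, *w)) for w in heads]))

rng = np.random.default_rng(0)
d, k = 8, 3                           # feature dim, mixture components
heads = [(rng.normal(size=(d, k)) * 0.1,
          rng.normal(size=(d, k)) * 0.1,
          rng.normal(size=(d, k)) * 0.1) for _ in range(5)]
h = rng.normal(size=d)                # features for one forecast step
density = ensemble_pdf(0.0, h, heads) # predictive density at load value 0.0
```

In the actual method the heads would be deep networks trained end-to-end on the load series; the sketch only shows how the mixture output and the ensemble average combine the two uncertainty sources.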
Related papers
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Practical Probabilistic Model-based Deep Reinforcement Learning by Integrating Dropout Uncertainty and Trajectory Sampling [7.179313063022576]
This paper addresses the prediction stability, prediction accuracy and control capability of the current probabilistic model-based reinforcement learning (MBRL) built on neural networks.
A novel approach dropout-based probabilistic ensembles with trajectory sampling (DPETS) is proposed.
arXiv Detail & Related papers (2023-09-20T06:39:19Z)
- $\clubsuit$ CLOVER $\clubsuit$: Probabilistic Forecasting with Coherent Learning Objective Reparameterization [42.215158938066054]
We augment an MQForecaster neural network architecture with a modified multivariate Gaussian factor model that achieves coherence by construction.
We call our method the Coherent Learning Objective Reparametrization Neural Network (CLOVER)
In comparison to state-of-the-art coherent forecasting methods, CLOVER achieves significant improvements in scaled CRPS forecast accuracy, with average gains of 15%.
arXiv Detail & Related papers (2023-07-19T07:31:37Z)
- Federated Conformal Predictors for Distributed Uncertainty Quantification [83.50609351513886]
Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning.
In this paper, we extend conformal prediction to the federated learning setting.
We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction framework.
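The federated framework builds on standard split conformal prediction; as a hedged, numpy-only sketch of that centralized baseline (function names are illustrative, not from the paper), calibration residuals are converted into a quantile that widens point predictions into intervals with marginal coverage guarantees:

```python
import numpy as np

def split_conformal_interval(cal_pred, cal_true, test_pred, alpha=0.1):
    """Split conformal prediction: calibrate absolute residuals on held-out
    data, then form intervals with ~(1 - alpha) marginal coverage."""
    scores = np.abs(cal_true - cal_pred)  # nonconformity scores
    n = len(scores)
    # finite-sample-corrected quantile of the calibration scores
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    return test_pred - q, test_pred + q

rng = np.random.default_rng(1)
y_cal = rng.normal(size=500)
pred_cal = y_cal + rng.normal(scale=0.3, size=500)  # imperfect model
lo, hi = split_conformal_interval(pred_cal, y_cal, test_pred=0.0)
```

The federated extension replaces the single calibration set with per-client quantile information combined under the weaker partial-exchangeability assumption; the sketch above only shows the exchangeable single-site case.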
arXiv Detail & Related papers (2023-05-27T19:57:27Z)
- Aggregating distribution forecasts from deep ensembles [0.0]
We propose a general quantile aggregation framework for deep ensembles.
We show that combining forecast distributions from deep ensembles can substantially improve the predictive performance.
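One common instance of quantile aggregation (sometimes called Vincentization) averages the members' quantile functions level by level instead of mixing their densities; a minimal numpy sketch, with hypothetical names and members represented by forecast samples:

```python
import numpy as np

def aggregate_quantiles(member_samples, levels):
    """Quantile aggregation: average each ensemble member's quantile
    function at the requested levels (Vincentization)."""
    member_q = [np.quantile(s, levels) for s in member_samples]
    return np.mean(member_q, axis=0)

rng = np.random.default_rng(2)
# five ensemble members, each represented by samples from its forecast distribution
members = [rng.normal(loc=m, scale=1.0, size=2000) for m in (-0.2, 0.0, 0.1, 0.2, 0.4)]
levels = np.array([0.1, 0.5, 0.9])
q = aggregate_quantiles(members, levels)  # combined forecast quantiles
```

The paper's general framework also covers weighted and adjusted combinations; equal-weight averaging is only the simplest member of that family.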
arXiv Detail & Related papers (2022-04-05T15:42:51Z)
- Distributional Gradient Boosting Machines [77.34726150561087]
Our framework is based on XGBoost and LightGBM.
We show that our framework achieves state-of-the-art forecast accuracy.
arXiv Detail & Related papers (2022-04-02T06:32:19Z)
- A Bayesian Deep Learning Approach to Near-Term Climate Prediction [12.870804083819603]
We pursue a complementary machine-learning-based approach to climate prediction.
In particular, we find that a feedforward convolutional network with a Densenet architecture is able to outperform a convolutional LSTM in terms of predictive skill.
arXiv Detail & Related papers (2022-02-23T00:28:36Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
PDC-Net+ is an Enhanced Probabilistic Dense Correspondence Network capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- Calibration and Uncertainty Quantification of Bayesian Convolutional Neural Networks for Geophysical Applications [0.0]
Subsurface models should provide calibrated probabilities and the associated uncertainties in their predictions.
It has been shown that popular Deep Learning-based models are often miscalibrated, and due to their deterministic nature, provide no means to interpret the uncertainty of their predictions.
We compare three different approaches to obtaining probabilistic models based on convolutional neural networks in a Bayesian formalism.
arXiv Detail & Related papers (2021-05-25T17:54:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.