A Global Modeling Approach for Load Forecasting in Distribution Networks
- URL: http://arxiv.org/abs/2204.00493v1
- Date: Fri, 1 Apr 2022 14:51:50 GMT
- Title: A Global Modeling Approach for Load Forecasting in Distribution Networks
- Authors: Miha Grabner, Yi Wang, Qingsong Wen, Boštjan Blažič, Vitomir Štruc
- Abstract summary: It is impractical to develop individual (or so-called local) forecasting models for each load separately.
This paper proposes a global modeling approach based on deep learning for efficient forecasting of a large number of loads in distribution networks.
- Score: 10.249757638247429
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Efficient load forecasting is needed to ensure better observability in the
distribution networks, whereas such forecasting is made possible by an
increasing number of smart meter installations. Because distribution networks
include a large number of different loads at various aggregation levels, such
as individual consumer, transformer station and feeder loads, it is
impractical to develop individual (or so-called local) forecasting models for
each load separately. Furthermore, such local models ignore the strong
dependencies between different loads that might be present due to their spatial
proximity and the characteristics of the distribution network. To address these
issues, this paper proposes a global modeling approach based on deep learning
for efficient forecasting of a large number of loads in distribution networks.
In this way, the computational burden of training a large number of local
forecasting models can be largely reduced, and the cross-series information
shared among different loads can be utilized. Additionally, an unsupervised
localization mechanism and optimal ensemble construction strategy are also
proposed to localize/personalize the forecasting model to different groups of
loads and to improve the forecasting accuracy further. Comprehensive
experiments are conducted on real-world smart meter data to demonstrate the
superiority of the proposed approach compared to competing methods.
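As a rough illustration of the global-versus-local distinction described in the abstract (a toy sketch with synthetic data and a shared linear autoregressive model standing in for the deep network, not the authors' architecture): a local approach fits one parameter vector per load series, while a global approach pools windowed samples from all series and fits a single set of shared parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for smart-meter data: 5 load series, 200 hourly readings each.
n_series, T, lags = 5, 200, 24
series = np.cumsum(rng.normal(size=(n_series, T)), axis=1)

def windows(y, lags):
    """Turn one series into (lag-window, next-value) training pairs."""
    X = np.stack([y[i:i + lags] for i in range(len(y) - lags)])
    t = y[lags:]
    return X, t

# Local approach: one parameter vector fitted per series.
local_models = []
for y in series:
    X, t = windows(y, lags)
    w, *_ = np.linalg.lstsq(X, t, rcond=None)
    local_models.append(w)

# Global approach: pool samples from every series, fit one shared parameter vector.
X_all = np.concatenate([windows(y, lags)[0] for y in series])
t_all = np.concatenate([windows(y, lags)[1] for y in series])
w_global, *_ = np.linalg.lstsq(X_all, t_all, rcond=None)

print(len(local_models), "local models vs. 1 global model of shape", w_global.shape)
```

The global fit sees every sample from every load, which is how cross-series information gets shared; the paper's localization/ensemble machinery then re-specializes such a model to groups of loads.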
Related papers
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Distributionally Robust Machine Learning with Multi-source Data [6.383451076043423]
We introduce a group distributionally robust prediction model to optimize an adversarial reward about explained variance with respect to a class of target distributions.
Compared to classical empirical risk minimization, the proposed robust prediction model improves the prediction accuracy for target populations with distribution shifts.
We demonstrate the performance of our proposed group distributionally robust method on simulated and real data with random forests and neural networks as base-learning algorithms.
arXiv Detail & Related papers (2023-09-05T13:19:40Z)
- Frugal day-ahead forecasting of multiple local electricity loads by aggregating adaptive models [0.0]
We focus on day-ahead electricity load forecasting of substations of the distribution network in France.
We develop a frugal variant, reducing the number of parameters estimated, to achieve transfer learning.
We highlight the interpretability of the models, which is important for operational applications.
arXiv Detail & Related papers (2023-02-16T10:17:19Z)
- Aggregating distribution forecasts from deep ensembles [0.0]
We study the question of how to aggregate distribution forecasts based on neural network-based approaches.
We show that combining forecast distributions can substantially improve the predictive performance.
We propose a general quantile aggregation framework for deep ensembles that shows superior performance compared to a linear combination of the forecast densities.
arXiv Detail & Related papers (2022-04-05T15:42:51Z)
- End-to-End Trajectory Distribution Prediction Based on Occupancy Grid Maps [29.67295706224478]
In this paper, we aim to forecast a future trajectory distribution of a moving agent in the real world, given the social scene images and historical trajectories.
We learn the distribution with symmetric cross-entropy using occupancy grid maps as an explicit and scene-compliant approximation to the ground-truth distribution.
In experiments, our method achieves state-of-the-art performance on the Stanford Drone dataset and Intersection Drone dataset.
arXiv Detail & Related papers (2022-03-31T09:24:32Z)
- Test-time Collective Prediction [73.74982509510961]
Multiple parties, each with its own trained model, want to jointly make predictions on future test points.
Agents wish to benefit from the collective expertise of the full set of agents, but may not be willing to release their data or model parameters.
We explore a decentralized mechanism to make collective predictions at test time, leveraging each agent's pre-trained model.
arXiv Detail & Related papers (2021-06-22T18:29:58Z)
- Probabilistic electric load forecasting through Bayesian Mixture Density Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated.
arXiv Detail & Related papers (2020-12-23T16:21:34Z)
- Generalization Properties of Optimal Transport GANs with Latent Distribution Learning [52.25145141639159]
We study how the interplay between the latent distribution and the complexity of the pushforward map affects performance.
Motivated by our analysis, we advocate learning the latent distribution as well as the pushforward map within the GAN paradigm.
arXiv Detail & Related papers (2020-07-29T07:31:33Z)
- Diversity inducing Information Bottleneck in Model Ensembles [73.80615604822435]
In this paper, we target the problem of generating effective ensembles of neural networks by encouraging diversity in prediction.
We explicitly optimize a diversity inducing adversarial loss for learning latent variables and thereby obtain diversity in the output predictions necessary for modeling multi-modal data.
Compared to the most competitive baselines, we show significant improvements in classification accuracy, under a shift in the data distribution.
arXiv Detail & Related papers (2020-03-10T03:10:41Z)
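The quantile aggregation idea from the "Aggregating distribution forecasts from deep ensembles" entry above can be contrasted with a linear pool in a few lines (an illustrative toy, not that paper's framework): a linear pool mixes the member densities, which amounts to pooling samples before taking quantiles, whereas quantile aggregation (often called Vincentization) averages the member quantile functions at each probability level.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy deep ensemble: each member's forecast distribution represented by samples.
members = [rng.normal(loc=mu, scale=s, size=10_000)
           for mu, s in [(10.0, 1.0), (12.0, 2.0), (11.0, 1.5)]]
levels = np.linspace(0.05, 0.95, 19)

# Linear pool: mix the densities, i.e. pool all samples, then take quantiles.
linear_pool_q = np.quantile(np.concatenate(members), levels)

# Quantile aggregation (Vincentization): average the member quantile functions.
vincent_q = np.mean([np.quantile(m, levels) for m in members], axis=0)

print(linear_pool_q.shape, vincent_q.shape)
```

Averaging quantile functions tends to keep the combined forecast sharp even when the members disagree in location, which is one motivation for preferring it over the linear combination of densities.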
This list is automatically generated from the titles and abstracts of the papers in this site.