Gaussian Process Learning-based Probabilistic Optimal Power Flow
- URL: http://arxiv.org/abs/2004.07757v1
- Date: Thu, 16 Apr 2020 16:49:31 GMT
- Title: Gaussian Process Learning-based Probabilistic Optimal Power Flow
- Authors: Parikshit Pareek and Hung D. Nguyen
- Abstract summary: The proposed method relies on a non-parametric Bayesian inference-based uncertainty propagation approach, called Gaussian Process (GP).
The proposed method provides reasonably accurate solutions, compared with Monte-Carlo Simulation (MCS) solutions, at different levels of uncertain renewable penetration and load uncertainty.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this letter, we present a novel Gaussian Process Learning-based
Probabilistic Optimal Power Flow (GP-POPF) for solving POPF under renewable and
load uncertainties of arbitrary distribution. The proposed method relies on a
non-parametric Bayesian inference-based uncertainty propagation approach,
called Gaussian Process (GP). We also suggest a new type of sensitivity called
Subspace-wise Sensitivity, using observations on the interpretability of
GP-POPF hyperparameters. The simulation results on 14-bus and 30-bus systems
show that the proposed method provides reasonably accurate solutions when
compared with Monte-Carlo Simulation (MCS) solutions at different levels of
uncertain renewable penetration as well as load uncertainties, while requiring
far fewer samples and much less elapsed time.
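As a rough sketch of the workflow (not the authors' implementation): train a GP on a small batch of solver runs, then push arbitrarily distributed input samples through the cheap surrogate instead of the solver itself. Here `solve_opf` is a hypothetical stand-in for a real OPF solver, and scikit-learn's `GaussianProcessRegressor` replaces the paper's GP machinery.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def solve_opf(p_uncertain):
    # Hypothetical stand-in for a deterministic OPF solve: maps uncertain
    # injections to one output of interest (e.g., a bus voltage magnitude).
    return np.sin(p_uncertain[:, 0]) + 0.5 * p_uncertain[:, 1] ** 2

# Step 1: run the expensive solver a small number of times to train the GP.
X_train = rng.uniform(-1.0, 1.0, size=(30, 2))
y_train = solve_opf(X_train)
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Step 2: propagate arbitrary (here beta-distributed, i.e. non-Gaussian)
# input uncertainty through the cheap surrogate instead of the solver.
X_mc = rng.beta(2.0, 5.0, size=(10_000, 2)) * 2.0 - 1.0
y_mean, y_std = gp.predict(X_mc, return_std=True)
print(f"output mean {y_mean.mean():.4f}, output spread {y_mean.std():.4f}")
```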
Related papers
- Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
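A minimal sketch of local linearization in the Gaussian special case (the paper handles general stable distributions and proves the total-variation optimality): affine maps transform the mean and covariance exactly, while the ReLU is replaced by its linearization at the input mean.

```python
import numpy as np

rng = np.random.default_rng(1)

def affine(mean, cov, W, b):
    # Exact propagation of the first two moments through y = W x + b.
    return W @ mean + b, W @ cov @ W.T

def relu_linearized(mean, cov):
    # Linearize ReLU at the input mean: the Jacobian is diag(mean > 0).
    J = np.diag((mean > 0).astype(float))
    return np.maximum(mean, 0.0), J @ cov @ J.T

mean, cov = np.zeros(3), 0.1 * np.eye(3)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

mean, cov = affine(mean, cov, W1, b1)
mean, cov = relu_linearized(mean, cov)
mean, cov = affine(mean, cov, W2, b2)
print("output mean", mean, "output variance", cov)
```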
arXiv Detail & Related papers (2024-02-13T09:40:19Z)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Entropic Matching for Expectation Propagation of Markov Jump Processes [38.60042579423602]
We propose a new tractable inference scheme based on an entropic matching framework.
We demonstrate the effectiveness of our method by providing closed-form results for a simple family of approximate distributions.
We derive expressions for point estimation of the underlying parameters using an approximate expectation procedure.
arXiv Detail & Related papers (2023-09-27T12:07:21Z)
- Neural Operator Variational Inference based on Regularized Stein Discrepancy for Deep Gaussian Processes [23.87733307119697]
We introduce Neural Operator Variational Inference (NOVI) for Deep Gaussian Processes.
NOVI uses a neural generator to obtain a sampler and minimizes the Regularized Stein Discrepancy in L2 space between the generated distribution and true posterior.
We demonstrate that the bias introduced by our method can be controlled by multiplying the divergence with a constant, which leads to robust error control and ensures the stability and precision of the algorithm.
arXiv Detail & Related papers (2023-09-22T06:56:35Z)
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is noise-free, resulting in a deterministic evolution for the particles.
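As a generic stand-in for the idea of deterministic, score-based particle evolution (the paper's regularized Wasserstein proximal construction is different and not reproduced here), the sketch below runs a noise-free particle flow whose score term is approximated with a kernel density estimate:

```python
import numpy as np

rng = np.random.default_rng(2)

def grad_V(x):
    # Potential V(x) = x^2 / 2, so the target density is N(0, 1).
    return x

def kde_score(x, h=0.3):
    # Gaussian-kernel density estimate of grad log rho at each particle.
    diff = x[:, None] - x[None, :]            # pairwise x_i - x_j
    K = np.exp(-diff**2 / (2.0 * h**2))
    rho = K.mean(axis=1)
    drho = (-diff / h**2 * K).mean(axis=1)
    return drho / rho

x = rng.uniform(-3.0, 3.0, size=200)          # randomness only in the init
dt = 0.05
for _ in range(400):
    # Noise-free update: Wasserstein gradient flow of KL(rho || e^{-V}).
    x = x - dt * (grad_V(x) + kde_score(x))

# Particle std approaches 1 up to kernel-smoothing bias.
print(f"particle mean {x.mean():.3f}, std {x.std():.3f}")
```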
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
- Piecewise Deterministic Markov Processes for Bayesian Neural Networks [20.865775626533434]
Inference on modern Bayesian Neural Networks (BNNs) often relies on a variational inference treatment, imposing assumptions of independence and a fixed posterior form that are frequently violated.
New Piecewise Deterministic Markov Process (PDMP) samplers permit subsampling, though they introduce model-specific inhomogeneous Poisson processes (IPPs), which are difficult to sample from.
This work introduces a new generic and adaptive thinning scheme for sampling from IPPs, and demonstrates how this approach can accelerate the application of PDMPs for inference in BNNs.
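The classic fixed-bound thinning scheme that the paper generalizes fits in a few lines; the adaptive bound construction that makes it practical for BNN posteriors is the paper's contribution and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_ipp(rate, rate_bound, t_max):
    """Sample event times of an inhomogeneous Poisson process on [0, t_max]
    by thinning a homogeneous process with constant intensity `rate_bound`."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_bound)    # candidate from bounding process
        if t > t_max:
            return np.array(events)
        if rng.uniform() < rate(t) / rate_bound:  # keep with prob rate(t) / bound
            events.append(t)

rate = lambda t: 5.0 * (1.0 + np.sin(t))          # intensity, bounded above by 10
events = sample_ipp(rate, rate_bound=10.0, t_max=20.0)
print(len(events), "events; expected about", round(100 + 5 * (1 - np.cos(20.0))))
```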
arXiv Detail & Related papers (2023-02-17T06:38:16Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
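A minimal sketch of the probabilistic representation being exploited, shown for the plain heat equation (the neural-solver training loop is not reproduced): by the Feynman-Kac formula, u(x, t) is an ensemble average of the initial condition over random Brownian particles.

```python
import numpy as np

rng = np.random.default_rng(4)

def heat_mc(x, t, g, sigma=1.0, n_walkers=100_000):
    """Feynman-Kac estimate of u(x, t) for u_t = (sigma^2 / 2) u_xx, u(., 0) = g:
    u(x, t) = E[g(x + sigma * W_t)] over Brownian endpoints W_t ~ N(0, t)."""
    endpoints = x + sigma * np.sqrt(t) * rng.standard_normal(n_walkers)
    return g(endpoints).mean()

g = np.sin
x, t, sigma = 0.3, 0.5, 1.0
est = heat_mc(x, t, g, sigma)
exact = np.exp(-sigma**2 * t / 2) * np.sin(x)   # closed form for sin initial data
print(f"MC estimate {est:.4f} vs exact {exact:.4f}")
```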
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Variational methods for simulation-based inference [3.308743964406687]
Sequential Neural Variational Inference (SNVI) is an approach to perform Bayesian inference in models with intractable likelihoods.
SNVI combines likelihood-estimation with variational inference to achieve a scalable simulation-based inference approach.
arXiv Detail & Related papers (2022-03-08T16:06:37Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
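A minimal sketch of the general VI-plus-IS recipe (not the paper's forward-KL refinement): treat a fitted Gaussian as the variational approximation and de-bias its posterior-mean estimate with self-normalized importance weights.

```python
import numpy as np

rng = np.random.default_rng(5)

def log_p(x):
    # Unnormalized target: a skewed mixture no single Gaussian q matches.
    return np.logaddexp(-0.5 * (x + 1.0)**2, np.log(0.3) - 0.5 * (x - 2.0)**2)

# Stand-in variational approximation q = N(mu, s^2), assumed already fitted;
# its scale is deliberately wider than the target so weights stay bounded.
mu, s = 0.0, 1.8
xs = rng.normal(mu, s, size=200_000)

log_q = -0.5 * ((xs - mu) / s)**2 - np.log(s)   # constants cancel below
log_w = log_p(xs) - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()                                    # self-normalized weights

print("q mean        :", mu)
print("IS-corrected  :", float(np.sum(w * xs)))  # de-biased posterior mean
```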
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
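A one-layer sketch of the sampling-free idea under multiplicative Gaussian activation noise: the first two moments of the pre-activations have closed forms, so the forward pass needs no sampling (the full method propagates moments through an entire network).

```python
import numpy as np

rng = np.random.default_rng(6)

d_in, d_out, alpha = 5, 3, 0.25
W = rng.normal(size=(d_out, d_in))
x = rng.normal(size=d_in)

# Sampling-free moments of y = W (x * eps) with independent eps_i ~ N(1, alpha):
# E[y] = W x and Var(y_k) = sum_i W_ki^2 * alpha * x_i^2.
mean_y = W @ x
var_y = (W**2) @ (alpha * x**2)

# Monte-Carlo check of the closed-form moments.
eps = rng.normal(1.0, np.sqrt(alpha), size=(100_000, d_in))
y_mc = (x * eps) @ W.T
print("analytic var:", np.round(var_y, 3))
print("MC var      :", np.round(y_mc.var(axis=0), 3))
```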
arXiv Detail & Related papers (2021-03-15T16:16:18Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
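As a baseline sketch of the surrogate idea, using an ordinary GP (the model the paper's DGP extends) and a hypothetical toy simulator: fit the surrogate to a simulator-vs-observation discrepancy over sampled parameters and read off the parameter that minimizes it.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(7)

def simulator(theta):
    # Hypothetical stochastic simulator; the true parameter is 1.5.
    return rng.normal(theta, 0.5, size=20).mean()

obs = 1.5                                     # pretend observed summary statistic
thetas = rng.uniform(-3.0, 3.0, size=(40, 1))  # a few simulator evaluations
discrepancy = np.array([(simulator(t[0]) - obs)**2 for t in thetas])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(thetas, discrepancy)                   # GP surrogate of the discrepancy

grid = np.linspace(-3.0, 3.0, 400).reshape(-1, 1)
mu = gp.predict(grid)
print("estimated theta:", float(grid[mu.argmin(), 0]))
```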
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including the list above) and is not responsible for any consequences of its use.