DeepONet-Grid-UQ: A Trustworthy Deep Operator Framework for Predicting
the Power Grid's Post-Fault Trajectories
- URL: http://arxiv.org/abs/2202.07176v1
- Date: Tue, 15 Feb 2022 04:01:57 GMT
- Title: DeepONet-Grid-UQ: A Trustworthy Deep Operator Framework for Predicting
the Power Grid's Post-Fault Trajectories
- Authors: Christian Moya, Shiqi Zhang, Meng Yue, and Guang Lin
- Abstract summary: This paper proposes a new data-driven method for the reliable prediction of power system post-fault trajectories.
The proposed method is based on the fundamentally new concept of Deep Operator Networks (DeepONets).
- Score: 10.972093683444648
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper proposes a new data-driven method for the reliable prediction of
power system post-fault trajectories. The proposed method is based on the
fundamentally new concept of Deep Operator Networks (DeepONets). Compared to
traditional neural networks that learn to approximate functions, DeepONets are
designed to approximate nonlinear operators. Under this operator framework, we
design a DeepONet to (1) take as inputs the fault-on trajectories collected,
for example, via simulation or phasor measurement units, and (2) provide as
outputs the predicted post-fault trajectories. In addition, we endow our method
with a much-needed ability to balance efficiency with reliable/trustworthy
predictions via uncertainty quantification. To this end, we propose and compare
two methods that enable quantifying the predictive uncertainty. First, we
propose a Bayesian DeepONet (B-DeepONet) that uses stochastic gradient
Hamiltonian Monte Carlo to sample from the posterior distribution of the
DeepONet parameters. Then, we propose a Probabilistic DeepONet
(Prob-DeepONet) that uses a probabilistic training strategy to equip DeepONets
with a form of automated uncertainty quantification, at virtually no extra
computational cost. Finally, we validate the predictive power and uncertainty
quantification capability of the proposed B-DeepONet and Prob-DeepONet using
the IEEE 16-machine 68-bus system.
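As a concrete but non-authoritative illustration of this operator framework and of the Prob-DeepONet idea, the sketch below builds a small branch/trunk DeepONet in PyTorch whose heads output a predictive mean and standard deviation trained with a Gaussian negative log-likelihood. Layer sizes, names, and the synthetic training step are assumptions made for illustration, not the authors' implementation, and the B-DeepONet's stochastic gradient Hamiltonian Monte Carlo sampler is not shown.

```python
import torch
import torch.nn as nn

class ProbDeepONet(nn.Module):
    """Sketch of a DeepONet with a probabilistic (mean, std) output head.

    branch net : encodes the fault-on trajectory sampled at `n_sensors` points
    trunk net  : encodes the query time t of the post-fault trajectory
    """

    def __init__(self, n_sensors: int = 100, p: int = 64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, 128), nn.Tanh(),
            nn.Linear(128, p),
        )
        self.trunk = nn.Sequential(
            nn.Linear(1, 128), nn.Tanh(),
            nn.Linear(128, p), nn.Tanh(),
        )
        # Two heads: predictive mean and (log) standard deviation.
        self.mean_head = nn.Linear(p, 1)
        self.log_std_head = nn.Linear(p, 1)

    def forward(self, u_fault_on, t):
        b = self.branch(u_fault_on)              # (batch, p)
        k = self.trunk(t)                        # (batch, p)
        z = b * k                                # combine branch and trunk features
        mean = self.mean_head(z)
        std = torch.exp(self.log_std_head(z))    # keep the std positive
        return mean, std


def gaussian_nll(mean, std, y):
    """Negative log-likelihood of y under N(mean, std^2); the spread is the UQ."""
    return (torch.log(std) + 0.5 * ((y - mean) / std) ** 2).mean()


# Illustrative training step on synthetic shapes only.
model = ProbDeepONet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
u = torch.randn(32, 100)   # fault-on trajectories sampled at 100 sensor points
t = torch.rand(32, 1)      # query times
y = torch.randn(32, 1)     # post-fault values at those times
mean, std = model(u, t)
loss = gaussian_nll(mean, std, y)
opt.zero_grad()
loss.backward()
opt.step()
```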
Related papers
- Unrolled denoising networks provably learn optimal Bayesian inference [54.79172096306631]
We prove the first rigorous learning guarantees for neural networks based on unrolling approximate message passing (AMP).
For compressed sensing, we prove that when trained on data drawn from a product prior, the layers of the network converge to the same denoisers used in Bayes AMP.
arXiv Detail & Related papers (2024-09-19T17:56:16Z)
- Conformalized-DeepONet: A Distribution-Free Framework for Uncertainty Quantification in Deep Operator Networks [7.119066725173193]
We use conformal prediction to obtain confidence prediction intervals with coverage guarantees for Deep Operator Network (DeepONet) regression.
We design a novel Quantile-DeepONet that allows for a more natural use of split conformal prediction.
We demonstrate the effectiveness of the proposed methods using various ordinary and partial differential equation numerical examples.
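Split conformal prediction, the ingredient behind such coverage guarantees, is simple enough to sketch; the NumPy snippet below is a generic illustration rather than the paper's Quantile-DeepONet, and it assumes a fitted point predictor with a predict method plus a held-out calibration set.

```python
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
    """Distribution-free prediction intervals from absolute calibration residuals."""
    residuals = np.abs(y_cal - model.predict(X_cal))
    n = len(residuals)
    # Finite-sample corrected quantile level for (1 - alpha) marginal coverage.
    level = np.ceil((n + 1) * (1 - alpha)) / n
    q = np.quantile(residuals, min(level, 1.0), method="higher")
    pred = model.predict(X_test)
    return pred - q, pred + q
```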
arXiv Detail & Related papers (2024-02-23T16:07:39Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Reliable extrapolation of deep neural operators informed by physics or sparse observations [2.887258133992338]
Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.
DeepONets provide a new simulation paradigm in science and engineering.
We propose five reliable learning methods that guarantee a safe prediction under extrapolation.
arXiv Detail & Related papers (2022-12-13T03:02:46Z)
- Variational Bayes Deep Operator Network: A data-driven Bayesian solver for parametric differential equations [0.0]
We propose Variational Bayes DeepONet (VB-DeepONet) for operator learning.
VB-DeepONet uses variational inference to approximate the high-dimensional posterior distribution over network parameters.
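As a rough illustration of the mean-field variational idea (not the VB-DeepONet architecture itself), a variational linear layer keeps a factorized Gaussian posterior over its weights, samples them with the reparameterization trick, and contributes a KL penalty to the training loss. The prior scale and initialization below are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalLinear(nn.Module):
    """Mean-field Gaussian posterior over the weights of a linear layer."""

    def __init__(self, d_in, d_out, prior_std=1.0):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(d_out, d_in))
        self.w_rho = nn.Parameter(torch.full((d_out, d_in), -5.0))  # softplus(rho) = std
        self.b = nn.Parameter(torch.zeros(d_out))
        self.prior_std = prior_std

    def forward(self, x):
        w_std = F.softplus(self.w_rho)
        # Reparameterization trick: sample weights, keep gradients w.r.t. mu and rho.
        w = self.w_mu + w_std * torch.randn_like(w_std)
        return F.linear(x, w, self.b)

    def kl(self):
        """KL( q(w) || N(0, prior_std^2) ), summed over all weights."""
        w_std = F.softplus(self.w_rho)
        var_ratio = (w_std / self.prior_std) ** 2
        return 0.5 * (var_ratio + (self.w_mu / self.prior_std) ** 2
                      - 1.0 - torch.log(var_ratio)).sum()
```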
arXiv Detail & Related papers (2022-06-12T04:20:11Z)
- Density Regression and Uncertainty Quantification with Bayesian Deep Noise Neural Networks [4.376565880192482]
Deep neural network (DNN) models have achieved state-of-the-art predictive accuracy in a wide range of supervised learning applications.
However, accurately quantifying the uncertainty in DNN predictions remains a challenging task.
We propose the Bayesian Deep Noise Neural Network (B-DeepNoise), which generalizes standard Bayesian DNNs by extending the random noise variable to all hidden layers.
We evaluate B-DeepNoise against existing methods on benchmark regression datasets, demonstrating its superior performance in terms of prediction accuracy, uncertainty quantification accuracy, and uncertainty quantification efficiency.
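A loose analogue of the core idea, noise injected at every hidden layer and kept at prediction time, can be sketched as follows; repeated stochastic forward passes then give a predictive mean and spread. The noise scale and layer sizes are assumptions, and the paper's actual posterior sampling scheme is not reproduced.

```python
import torch
import torch.nn as nn

class NoisyMLP(nn.Module):
    """MLP with additive Gaussian noise injected at every hidden layer."""

    def __init__(self, d_in=8, width=64, noise_std=0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_in, width)
        self.fc2 = nn.Linear(width, width)
        self.out = nn.Linear(width, 1)
        self.noise_std = noise_std

    def forward(self, x):
        h = torch.tanh(self.fc1(x))
        h = h + self.noise_std * torch.randn_like(h)   # noise after the first hidden layer
        h = torch.tanh(self.fc2(h))
        h = h + self.noise_std * torch.randn_like(h)   # noise after the second hidden layer
        return self.out(h)

@torch.no_grad()
def predictive_moments(model, x, n_samples=100):
    """Mean and spread of repeated stochastic forward passes."""
    samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)
```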
arXiv Detail & Related papers (2022-06-12T02:47:29Z)
- Scalable Uncertainty Quantification for Deep Operator Networks using Randomized Priors [14.169588600819546]
We present a simple and effective approach for posterior uncertainty quantification in deep operator networks (DeepONets).
We adopt a frequentist approach based on randomized prior ensembles, and put forth an efficient vectorized implementation for fast parallel inference on accelerated hardware.
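The randomized-prior construction itself is easy to sketch generically: each ensemble member adds a frozen, randomly initialized "prior" network to a trainable one, only the trainable part is fit to data, and the spread across members serves as the uncertainty. The PyTorch sketch below uses assumed sizes and is not the paper's vectorized DeepONet implementation.

```python
import torch
import torch.nn as nn

def make_net():
    return nn.Sequential(nn.Linear(8, 64), nn.Tanh(), nn.Linear(64, 1))

class RandomizedPriorMember(nn.Module):
    """f(x) = trainable(x) + beta * prior(x), with the prior net frozen."""

    def __init__(self, beta=1.0):
        super().__init__()
        self.trainable = make_net()
        self.prior = make_net()
        for p in self.prior.parameters():
            p.requires_grad_(False)          # the prior is never updated
        self.beta = beta

    def forward(self, x):
        return self.trainable(x) + self.beta * self.prior(x)

# An ensemble of such members; the spread of their predictions is the uncertainty.
ensemble = [RandomizedPriorMember() for _ in range(10)]

@torch.no_grad()
def ensemble_mean_std(x):
    preds = torch.stack([m(x) for m in ensemble])
    return preds.mean(dim=0), preds.std(dim=0)
```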
arXiv Detail & Related papers (2022-03-06T20:48:16Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
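The Nadaraya-Watson estimate underneath this is a classical kernel-weighted average; the NumPy sketch below estimates the conditional label distribution for a test point from training embeddings and labels and scores uncertainty by its entropy. The Gaussian kernel and bandwidth are assumptions, and the paper's specific construction is not reproduced.

```python
import numpy as np

def nadaraya_watson_probs(x, X_train, y_train, n_classes, bandwidth=1.0):
    """Kernel estimate of p(y | x) from labelled training points."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))          # Gaussian kernel weights
    probs = np.zeros(n_classes)
    for c in range(n_classes):
        probs[c] = w[y_train == c].sum()
    total = probs.sum()
    return probs / total if total > 0 else np.full(n_classes, 1.0 / n_classes)

def predictive_entropy(probs, eps=1e-12):
    """Higher entropy of the estimated label distribution = more uncertain prediction."""
    return -np.sum(probs * np.log(probs + eps))
```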
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Probabilistic electric load forecasting through Bayesian Mixture Density Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated.
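A plain (non-Bayesian) mixture density network is the usual starting point here; the PyTorch sketch below, with assumed sizes, maps input features to the parameters of a Gaussian mixture and trains with the mixture negative log-likelihood. The Bayesian treatment via Mean Field variational inference or deep ensembles described in the paper is not reproduced.

```python
import torch
import torch.nn as nn

class MDN(nn.Module):
    """Maps input features to the parameters of a K-component Gaussian mixture."""

    def __init__(self, d_in=24, hidden=64, k=5):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(d_in, hidden), nn.ReLU())
        self.logits = nn.Linear(hidden, k)        # mixture weights (before softmax)
        self.means = nn.Linear(hidden, k)
        self.log_scales = nn.Linear(hidden, k)

    def forward(self, x):
        h = self.backbone(x)
        return self.logits(h), self.means(h), torch.exp(self.log_scales(h))

def mdn_nll(logits, means, scales, y):
    """Negative log-likelihood of scalar targets y under the predicted mixture."""
    log_w = torch.log_softmax(logits, dim=-1)
    comp = torch.distributions.Normal(means, scales)
    log_prob = comp.log_prob(y.unsqueeze(-1))     # (batch, K)
    return -torch.logsumexp(log_w + log_prob, dim=-1).mean()
```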
arXiv Detail & Related papers (2020-12-23T16:21:34Z)
- ESPN: Extremely Sparse Pruned Networks [50.436905934791035]
We show that a simple iterative mask discovery method can achieve state-of-the-art compression of very deep networks.
Our algorithm represents a hybrid approach between single shot network pruning methods and Lottery-Ticket type approaches.
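As a generic illustration of iterative mask discovery (not the ESPN algorithm itself), the sketch below alternates brief training with magnitude-based mask updates until a target sparsity is reached; the schedule, threshold rule, and the user-supplied train_one_epoch callable are assumptions.

```python
import torch

def magnitude_mask(weight, sparsity):
    """Keep the largest-magnitude entries, zero out a `sparsity` fraction."""
    k = int(sparsity * weight.numel())
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

def iterative_prune(model, train_one_epoch, final_sparsity=0.95, rounds=5):
    """Gradually raise sparsity over several rounds, retraining in between."""
    for r in range(1, rounds + 1):
        train_one_epoch(model)
        sparsity = final_sparsity * r / rounds
        for module in model.modules():
            if isinstance(module, torch.nn.Linear):
                mask = magnitude_mask(module.weight.data, sparsity)
                module.weight.data.mul_(mask)   # apply the current mask
```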
arXiv Detail & Related papers (2020-06-28T23:09:27Z)
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out-of-distribution data points at test time with a single forward pass.
We scale training in such models with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
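The single-forward-pass idea can be illustrated generically with class centroids and an RBF kernel: an input whose embedding is far from every centroid receives a low maximum kernel score and can be flagged as out-of-distribution. The sketch below uses assumed sizes and length scale and does not reproduce the paper's loss or centroid updating scheme.

```python
import torch

def rbf_scores(embedding, centroids, length_scale=0.5):
    """Kernel similarity between one embedding and each class centroid."""
    d2 = ((centroids - embedding) ** 2).sum(dim=1)
    return torch.exp(-d2 / (2 * length_scale ** 2))

# Example: 10 class centroids in a 32-d embedding space (illustrative values).
centroids = torch.randn(10, 32)
z = torch.randn(32)                    # embedding of a test input
scores = rbf_scores(z, centroids)
prediction = scores.argmax()
certainty = scores.max()               # low maximum score = likely out-of-distribution
```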
arXiv Detail & Related papers (2020-03-04T12:27:36Z)