Fully probabilistic deep models for forward and inverse problems in
parametric PDEs
- URL: http://arxiv.org/abs/2208.04856v2
- Date: Fri, 14 Jul 2023 09:20:18 GMT
- Title: Fully probabilistic deep models for forward and inverse problems in
parametric PDEs
- Authors: Arnaud Vadeboncoeur, Ömer Deniz Akyildiz, Ieva Kazlauskaite, Mark
Girolami, Fehmi Cirak
- Abstract summary: We introduce a physics-driven deep latent variable model (PDDLVM) to learn simultaneously parameter-to-solution (forward) and solution-to-parameter (inverse) maps of PDEs.
The proposed framework can be easily extended to seamlessly integrate observed data to solve inverse problems and to build generative models.
We demonstrate the efficiency and robustness of our method on finite element discretized parametric PDE problems.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a physics-driven deep latent variable model (PDDLVM) to learn
simultaneously parameter-to-solution (forward) and solution-to-parameter
(inverse) maps of parametric partial differential equations (PDEs). Our
formulation leverages conventional PDE discretization techniques, deep neural
networks, probabilistic modelling, and variational inference to assemble a
fully probabilistic coherent framework. In the posited probabilistic model,
both the forward and inverse maps are approximated as Gaussian distributions
with a mean and covariance parameterized by deep neural networks. The PDE
residual is assumed to be an observed random vector of value zero, hence we
model it as a random vector with a zero mean and a user-prescribed covariance.
The model is trained by maximizing the probability of observing a residual of
zero, that is, the evidence or marginal likelihood, via the evidence lower
bound (ELBO). Consequently, the proposed methodology does not require any
independent PDE solves and is physics-informed at training time, allowing the
real-time solution of PDE forward and inverse problems after training. The
proposed framework can be easily extended to seamlessly integrate observed data
to solve inverse problems and to build generative models. We demonstrate the
efficiency and robustness of our method on finite element discretized
parametric PDE problems such as linear and nonlinear Poisson problems, elastic
shells with complex 3D geometries, and time-dependent nonlinear and
inhomogeneous PDEs using a physics-informed neural network (PINN)
discretization. We achieve up to three orders of magnitude speed-up after
training compared to traditional finite element method (FEM), while outputting
coherent uncertainty estimates.
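The zero-residual likelihood at the heart of the abstract can be illustrated with a minimal numpy sketch (a toy illustration under assumed notation, not the authors' PDDLVM: the deep-network parameterizations and variational inference are omitted, and the forward map is solved directly):

```python
import numpy as np

# 1D Poisson problem -u'' = 1 on (0, 1) with u(0) = u(1) = 0, discretized
# by central finite differences so the residual is r(u) = A u - f.
n = 32
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n))
     + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1)) / h**2
f = np.ones(n)

# The residual is treated as an observed random vector of value zero with a
# user-prescribed covariance sigma_r^2 I; maximizing the Gaussian likelihood
# N(0 | A u - f, sigma_r^2 I) over u recovers the classical solve.
sigma_r = 1e-3
u_star = np.linalg.solve(A, f)
r = A @ u_star - f
log_lik = -0.5 * (r @ r) / sigma_r**2 - 0.5 * n * np.log(2 * np.pi * sigma_r**2)

# For this problem the discrete solution matches u(x) = x(1 - x)/2 exactly,
# since the 3-point stencil is exact for quadratics.
x = np.linspace(h, 1.0 - h, n)
u_exact = 0.5 * x * (1.0 - x)
```

In the paper's setting the maximization is over network weights rather than nodal values, but the "observe a residual of zero" likelihood is the same ingredient.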
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Reduced-order modeling for parameterized PDEs via implicit neural representations [4.135710717238787]
We present a new data-driven reduced-order modeling approach to efficiently solve parametrized partial differential equations (PDEs).
The proposed framework encodes the PDE and utilizes a parametrized neural ODE (PNODE) to learn latent dynamics characterized by multiple PDE parameters.
We evaluate the proposed method at a large Reynolds number and obtain speed-ups of up to O(10^3) with 1% relative error to the ground-truth values.
arXiv Detail & Related papers (2023-11-28T01:35:06Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
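The probabilistic representation referred to here can be sketched for the 1D heat equation via the Feynman-Kac formula (a generic illustration under assumed setup, not the paper's solver):

```python
import numpy as np

# For u_t = u_xx with initial condition u(0, x) = g(x), the solution has the
# probabilistic representation u(t, x) = E[g(x + sqrt(2 t) Z)], Z ~ N(0, 1):
# the macroscopic field is an ensemble average over random particles.
rng = np.random.default_rng(0)

def mc_heat(g, t, x, n_samples=200_000):
    z = rng.standard_normal(n_samples)
    return g(x + np.sqrt(2.0 * t) * z).mean()

# With g = sin, the exact solution is u(t, x) = exp(-t) * sin(x).
t, x = 0.25, 1.0
estimate = mc_heat(np.sin, t, x)
exact = np.exp(-t) * np.sin(x)
```

A neural solver in this spirit fits a network to such Monte Carlo estimates instead of a discretized residual.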
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Physics-Informed Gaussian Process Regression Generalizes Linear PDE Solvers [32.57938108395521]
Linear partial differential equations, a class of mechanistic models, are used to describe physical processes such as heat transfer, electromagnetism, and wave propagation.
Specialized numerical methods based on discretization are used to solve PDEs.
By ignoring parameter and measurement uncertainty, classical PDE solvers may fail to produce consistent estimates of their inherent approximation error.
arXiv Detail & Related papers (2022-12-23T17:02:59Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
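The Stein's-identity trick mentioned in this summary can be sketched as follows (an assumed estimator form with a simple variance-reduction baseline, not the paper's code):

```python
import numpy as np

# For the Gaussian-smoothed function f_sigma(x) = E[f(x + sigma * eps)],
# eps ~ N(0, 1), Stein's identity gives
#   f_sigma''(x) = E[f(x + sigma * eps) * (eps**2 - 1)] / sigma**2,
# so second derivatives need only forward evaluations of f, with no
# back-propagation through f.
rng = np.random.default_rng(1)

def stein_second_derivative(f, x, sigma, n_samples=200_000):
    eps = rng.standard_normal(n_samples)
    # Subtracting f(x) leaves the expectation unchanged (E[eps^2 - 1] = 0)
    # but substantially reduces the variance of the estimator.
    return np.mean((f(x + sigma * eps) - f(x)) * (eps**2 - 1)) / sigma**2

# Check against the closed form: E[sin(x + sigma * eps)] = exp(-sigma^2/2) sin(x),
# whose second derivative is -exp(-sigma^2/2) * sin(x).
x, sigma = 1.0, 0.5
estimate = stein_second_derivative(np.sin, x, sigma)
exact = -np.exp(-sigma**2 / 2) * np.sin(x)
```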
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Semi-Implicit Neural Solver for Time-dependent Partial Differential Equations [4.246966726709308]
We propose a neural solver to learn an optimal iterative scheme in a data-driven fashion for any class of PDEs.
We provide theoretical guarantees for the correctness and convergence of neural solvers analogous to conventional iterative solvers.
arXiv Detail & Related papers (2021-09-03T12:03:10Z)
- Bayesian neural networks for weak solution of PDEs with uncertainty quantification [3.4773470589069473]
A new physics-constrained neural network (NN) approach is proposed to solve PDEs without labels.
We write the loss function of NNs based on the discretized residual of PDEs through an efficient, convolutional operator-based, and vectorized implementation.
We demonstrate the capability and performance of the proposed framework by applying it to steady-state diffusion, linear elasticity, and nonlinear elasticity.
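A convolutional, label-free residual loss of the kind described can be sketched in one dimension (an assumed formulation, not the paper's implementation):

```python
import numpy as np

# For -u'' = f on a uniform 1D grid with u(0) = u(1) = 0, the stencil
# [-1, 2, -1] / h^2 applied as a convolution approximates -u'' at interior
# nodes; the loss penalizes the squared discretized residual -u'' - f,
# requiring no labeled solution data.
n = 65
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
f = np.ones(n)

def residual_loss(u):
    lap = np.convolve(u, np.array([-1.0, 2.0, -1.0]) / h**2, mode="valid")
    r = lap - f[1:-1]          # residual -u'' - f at the n - 2 interior nodes
    return np.mean(r**2)

u_exact = 0.5 * x * (1.0 - x)  # exact solution of -u'' = 1 with zero BCs
```

The same idea extends to 2D/3D with multichannel convolution kernels, which is what makes the vectorized implementation efficient on accelerators.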
arXiv Detail & Related papers (2021-01-13T04:57:51Z)
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets [0.0]
A novel extension of the Probabilistic Learning on Manifolds (PLoM) is presented.
It makes it possible to synthesize solutions to a wide range of nonlinear boundary value problems.
Three applications are presented.
arXiv Detail & Related papers (2020-10-27T14:34:54Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Identification of Probability weighted ARX models with arbitrary domains [75.91002178647165]
PieceWise Affine models guarantee universal approximation, local linearity, and equivalence to other classes of hybrid systems.
In this work, we focus on the identification of PieceWise Auto-Regressive with eXogenous input models with arbitrary regions (NPWARX).
The architecture is conceived following the Mixture of Experts concept, developed within the machine learning field.
arXiv Detail & Related papers (2020-09-29T12:50:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.