On the estimation rate of Bayesian PINN for inverse problems
- URL: http://arxiv.org/abs/2406.14808v1
- Date: Fri, 21 Jun 2024 01:13:18 GMT
- Title: On the estimation rate of Bayesian PINN for inverse problems
- Authors: Yi Sun, Debarghya Mukherjee, Yves Atchade
- Abstract summary: Solving partial differential equations (PDEs) and their inverse problems using Physics-informed neural networks (PINNs) is a rapidly growing approach in the physics and machine learning community.
We study the behavior of a Bayesian PINN estimator of the solution of a PDE from $n$ independent noisy measurements of the solution.
- Score: 10.100602879566782
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Solving partial differential equations (PDEs) and their inverse problems using Physics-informed neural networks (PINNs) is a rapidly growing approach in the physics and machine learning community. Although several architectures exist for PINNs that work remarkably well in practice, our theoretical understanding of their performance is somewhat limited. In this work, we study the behavior of a Bayesian PINN estimator of the solution of a PDE from $n$ independent noisy measurements of the solution. We focus on a class of equations that are linear in their parameters (with unknown coefficients $\theta_\star$). We show that when the partial differential equation admits a classical solution (say $u_\star$), differentiable to order $\beta$, the mean square error of the Bayesian posterior mean is at least of order $n^{-2\beta/(2\beta + d)}$. Furthermore, we establish a convergence rate for the linear coefficients $\theta_\star$ depending on the order of the underlying differential operator. Last but not least, our theoretical results are validated through extensive simulations.
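The paper's linear-in-parameters setting can be illustrated in a very simple discretized form. Below is a hedged toy sketch, not the paper's Bayesian PINN: it recovers the coefficient of the one-dimensional equation $u'' + \theta u = 0$ (an illustrative choice, with classical solution $u_\star(x)=\sin(x)$ and true $\theta_\star = 1$) from noisy grid samples by least squares on a finite-difference residual, which is linear in $\theta$ just as the operator is linear in the unknown coefficients here.

```python
import numpy as np

# Hedged toy sketch (not the paper's Bayesian PINN): recover theta in the
# linear-in-parameters PDE u'' + theta * u = 0 from noisy samples of the
# classical solution u(x) = sin(x), for which the true theta is 1.
rng = np.random.default_rng(0)
x = np.linspace(0.0, np.pi, 201)
h = x[1] - x[0]
sigma = 1e-4  # noise kept small: finite differences amplify measurement noise
u_noisy = np.sin(x) + sigma * rng.standard_normal(x.size)

# Central-difference second derivative on the interior grid points.
u_xx = (u_noisy[2:] - 2.0 * u_noisy[1:-1] + u_noisy[:-2]) / h**2
u_mid = u_noisy[1:-1]

# The residual r(theta) = u_xx + theta * u is linear in theta, so the
# least-squares estimate has a closed form.
theta_hat = -(u_xx @ u_mid) / (u_mid @ u_mid)
print(theta_hat)  # close to the true value 1
```

The difference quotient amplifies measurement noise by roughly $\sigma/h^2$, which is one reason smooth surrogates such as (Bayesian) PINNs are attractive: the network is differentiated exactly, rather than the noisy data.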
Related papers
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space for modeling functions learned by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - A backward differential deep learning-based algorithm for solving high-dimensional nonlinear backward stochastic differential equations [0.6040014326756179]
We propose a novel backward differential deep learning-based algorithm for solving high-dimensional nonlinear backward stochastic differential equations.
The deep neural network (DNN) models are trained not only on the inputs and labels but also on the differentials of the corresponding labels.
arXiv Detail & Related papers (2024-04-12T13:05:35Z) - Generating synthetic data for neural operators [0.0]
We propose a different approach to generating synthetic functional training data that does not require solving a PDE numerically.
While the idea is simple, we hope this method will expand the potential for developing neural PDE solvers that do not depend on classical numerical solvers.
arXiv Detail & Related papers (2024-01-04T18:31:21Z) - Learning Only On Boundaries: a Physics-Informed Neural Operator for Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments show the method's effectiveness on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z) - Learning-based solutions to nonlinear hyperbolic PDEs: Empirical insights on generalization errors [1.0312968200748118]
We study learning weak solutions to nonlinear hyperbolic partial differential equations (H-PDEs).
$\pi$-FNO generalizes well to unseen initial and boundary conditions.
Adding a physics-informed regularizer improved the prediction of discontinuities in the solution.
arXiv Detail & Related papers (2023-02-16T08:44:17Z) - Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore GNNs to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z) - A Law of Robustness beyond Isoperimetry [84.33752026418045]
We prove a Lipschitzness lower bound $\Omega(\sqrt{n/p})$ on the robustness of interpolating neural network parameters on arbitrary distributions.
We then show the potential benefit of overparametrization for smooth data when $n=\mathrm{poly}(d)$.
We disprove the potential existence of an $O(1)$-Lipschitz robust interpolating function when $n=\exp(\omega(d))$.
arXiv Detail & Related papers (2022-02-23T16:10:23Z) - Locality defeats the curse of dimensionality in convolutional teacher-student scenarios [69.2027612631023]
We show that locality is key in determining the learning curve exponent $\beta$.
We conclude by proving, using a natural assumption, that performing kernel regression with a ridge that decreases with the size of the training set leads to similar learning curve exponents to those we obtain in the ridgeless case.
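The closing claim above, that kernel regression with a ridge shrinking with the training-set size behaves like the ridgeless case, can be sketched with minimal kernel ridge regression. The RBF kernel, the sine target, and the $\lambda = 1/n$ schedule below are illustrative assumptions, not taken from that paper:

```python
import numpy as np

# Hedged sketch: kernel ridge regression with a ridge that decreases with
# the training-set size (lambda = 1/n). The RBF kernel and sine target
# are illustrative choices, not the paper's teacher-student setup.
def rbf(a, b):
    # Gaussian (RBF) kernel matrix between two 1-D point sets.
    return np.exp(-(a[:, None] - b[None, :]) ** 2)

def krr_predict(x_train, y_train, x_test, lam):
    K = rbf(x_train, x_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return rbf(x_test, x_train) @ alpha

rng = np.random.default_rng(0)
n = 200
x_train = rng.uniform(-1.0, 1.0, n)
y_train = np.sin(3.0 * x_train)
x_test = np.linspace(-1.0, 1.0, 50)
pred = krr_predict(x_train, y_train, x_test, lam=1.0 / n)  # ridge ~ 1/n
err = np.max(np.abs(pred - np.sin(3.0 * x_test)))
```

With the ridge shrinking as $1/n$, the fit of smooth targets is barely regularized for large $n$, so the prediction error stays close to the ridgeless (interpolating) behavior.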
arXiv Detail & Related papers (2021-06-16T08:27:31Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
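The core FNO idea, parameterizing the integral kernel directly in Fourier space, amounts to: FFT the input function, multiply each retained low-frequency mode by a learned complex weight, and inverse FFT. A minimal one-dimensional, single-channel sketch (illustrative only, not the authors' implementation):

```python
import numpy as np

# Hedged sketch (not the authors' code): a single 1-D, single-channel
# Fourier layer. The learned object is one complex weight per retained
# low-frequency mode, applied directly in Fourier space.
def fourier_layer(u, weights, n_modes):
    """u: (n,) real samples; weights: (n_modes,) complex; returns (n,) real."""
    u_hat = np.fft.rfft(u)                          # forward real FFT
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]   # truncate and reweight modes
    return np.fft.irfft(out_hat, n=u.size)          # back to physical space

rng = np.random.default_rng(0)
u = np.sin(np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False))
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)
v = fourier_layer(u, w, n_modes=8)  # shape (64,), real-valued
```

In the full FNO, each layer also carries a pointwise linear path and a nonlinearity; the sketch shows only the spectral multiplication, which is what makes the operator resolution-independent.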
arXiv Detail & Related papers (2020-10-18T00:34:21Z) - nPINNs: nonlocal Physics-Informed Neural Networks for a parametrized nonlocal universal Laplacian operator. Algorithms and Applications [0.0]
Physics-informed neural networks (PINNs) are effective in solving inverse problems based on differential and integral equations with sparse, unstructured and multi-fidelity data.
In this paper, we extend PINNs to parameter and function inference for integral equations such as the nonlocal Poisson equation and nonlocal turbulence models.
Our results show that nPINNs can jointly infer this function as well as $\delta$.
arXiv Detail & Related papers (2020-04-08T21:48:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.