Error bounds for PDE-regularized learning
- URL: http://arxiv.org/abs/2003.06524v1
- Date: Sat, 14 Mar 2020 00:51:39 GMT
- Title: Error bounds for PDE-regularized learning
- Authors: Carsten Gräser and Prem Anand Alathur Srinivasan
- Abstract summary: We consider the regularization of a supervised learning problem by partial differential equations (PDEs).
We derive error bounds for the obtained approximation in terms of a PDE error term and a data error term.
- Score: 0.6445605125467573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work we consider the regularization of a supervised learning problem
by partial differential equations (PDEs) and derive error bounds for the
obtained approximation in terms of a PDE error term and a data error term.
Assuming that the target function satisfies an unknown PDE, the PDE error term
quantifies how well this PDE is approximated by the auxiliary PDE used for
regularization. It is shown that this error term decreases if more data is
provided. The data error term quantifies the accuracy of the given data.
Furthermore, the PDE-regularized learning problem is discretized by generalized
Galerkin discretizations solving the associated minimization problem in subsets
of the infinite-dimensional function space, which are not necessarily
subspaces. For such discretizations, an error bound in terms of the PDE error,
the data error, and a best approximation error is derived.
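To make the structure of the objective and the bounds concrete, the following is a minimal LaTeX sketch of one plausible form of the setup. The notation (auxiliary operator $\mathcal{L}$, right-hand side $f$, regularization weight $\lambda$, discrete subset $V_h$, and constants $C_1$, $C_2$, $C_3$) is assumed for illustration and is not taken from the paper itself.

```latex
% Illustrative sketch only (requires amsmath): the symbols and the exact form
% of the bounds are assumptions, not the paper's statements.
\begin{align*}
  % PDE-regularized learning: fit the data (x_i, y_i) while penalizing the
  % residual of the auxiliary PDE  \mathcal{L} u = f  used for regularization.
  u_\lambda &\in \operatorname*{arg\,min}_{u \in V}\;
    \frac{1}{n}\sum_{i=1}^{n} \bigl(u(x_i) - y_i\bigr)^2
    + \lambda\, \bigl\|\mathcal{L} u - f\bigr\|^2 \\
  % Error bound: a PDE error term (how well the auxiliary PDE approximates the
  % unknown PDE satisfied by the target u^*) plus a data error term.
  \bigl\|u^* - u_\lambda\bigr\|
    &\le C_1 \underbrace{\bigl\|\mathcal{L} u^* - f\bigr\|}_{\text{PDE error}}
     + C_2 \underbrace{\Bigl(\tfrac{1}{n}\sum_{i=1}^{n}
        \bigl(u^*(x_i)-y_i\bigr)^2\Bigr)^{1/2}}_{\text{data error}} \\
  % Generalized Galerkin discretization: minimize the same functional over a
  % subset V_h of V that need not be a subspace; this adds a best
  % approximation term to the bound.
  \bigl\|u^* - u_{\lambda,h}\bigr\|
    &\le C_1 \bigl\|\mathcal{L} u^* - f\bigr\|
     + C_2 \Bigl(\tfrac{1}{n}\sum_{i=1}^{n}\bigl(u^*(x_i)-y_i\bigr)^2\Bigr)^{1/2}
     + C_3 \inf_{v_h \in V_h} \bigl\|u^* - v_h\bigr\|
\end{align*}
```

A neural-network ansatz is one natural example of such a subset $V_h$ that is not a linear subspace, which is why the abstract speaks of generalized Galerkin discretizations.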
Related papers
- Adaptation of uncertainty-penalized Bayesian information criterion for parametric partial differential equation discovery [1.1049608786515839]
We introduce an extension of the uncertainty-penalized Bayesian information criterion (UBIC) to solve parametric PDE discovery problems efficiently.
UBIC uses quantified PDE uncertainty over different temporal or spatial points to prevent overfitting in model selection.
We show that our extended UBIC can identify the true number of terms and their varying coefficients accurately, even in the presence of noise.
arXiv Detail & Related papers (2024-08-15T12:10:50Z) - DiffusionPDE: Generative PDE-Solving Under Partial Observation [10.87702379899977]
We introduce a general framework for solving partial differential equations (PDEs) using generative diffusion models.
We show that the learned generative priors lead to a versatile framework for accurately solving a wide range of PDEs under partial observation.
arXiv Detail & Related papers (2024-06-25T17:48:24Z) - Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver) capable of solving a wide scope of PDEs.
Our key finding is that a PDE solution is fundamentally under the control of a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z) - Data-Driven Discovery of PDEs via the Adjoint Method [4.014524824655106]
We present an adjoint-based method for discovering the underlying governing partial differential equations (PDEs) given data.
We show the efficacy of the proposed approach in identifying the form of the PDE.
We also compare its performance with the famous PDE Functional Identification of Dynamics method known as PDE-FIND.
arXiv Detail & Related papers (2024-01-30T17:10:42Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Weak-PDE-LEARN: A Weak Form Based Approach to Discovering PDEs From Noisy, Limited Data [0.0]
We introduce Weak-PDE-LEARN, a discovery algorithm that can identify non-linear PDEs from noisy, limited measurements of their solutions.
We demonstrate the efficacy of Weak-PDE-LEARN by learning several benchmark PDEs.
arXiv Detail & Related papers (2023-09-09T06:45:15Z) - Error Analysis of Kernel/GP Methods for Nonlinear and Parametric PDEs [16.089904161628258]
We introduce a priori Sobolev-space error estimates for the solution of nonlinear, and possibly parametric, PDEs.
The proof is articulated around Sobolev norm error estimates for kernel interpolants.
The error estimates demonstrate dimension-benign convergence rates if the solution space of the PDE is smooth enough.
arXiv Detail & Related papers (2023-05-08T18:00:33Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method which can partially alleviate the training-data demands of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that an exhaustive list of such data transformations can be derived quantitatively.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences arising from its use.