Learning Partial Differential Equations by Spectral Approximates of
General Sobolev Spaces
- URL: http://arxiv.org/abs/2301.04887v1
- Date: Thu, 12 Jan 2023 09:04:32 GMT
- Title: Learning Partial Differential Equations by Spectral Approximates of
General Sobolev Spaces
- Authors: Juan-Esteban Suarez Cardona, Phil-Alexander Hofmann and Michael Hecht
- Abstract summary: We introduce a novel spectral, finite-dimensional approximation of general Sobolev spaces in terms of Chebyshev polynomials.
We find a variational formulation, solving a vast class of linear and non-linear partial differential equations.
In contrast to PINNs, the PSMs result in a convex optimisation problem for a vast class of PDEs, including all linear ones.
- Score: 0.45880283710344055
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a novel spectral, finite-dimensional approximation of general
Sobolev spaces in terms of Chebyshev polynomials. Based on this polynomial
surrogate model (PSM), we realise a variational formulation that solves a vast
class of linear and non-linear partial differential equations (PDEs). The PSMs
are as flexible as physics-informed neural nets (PINNs) and provide an
alternative for addressing inverse PDE problems, such as PDE-parameter
inference. In contrast to PINNs, the PSMs result in a convex optimisation
problem for a vast class of PDEs, including all linear ones, in which case the
PSM approximation is efficiently computable due to the exponential convergence
rate of the underlying variational gradient descent.
As a practical consequence, prominent PDE problems were solved by the PSMs
on a local machine, without High-Performance Computing (HPC). This gain in
efficiency is complemented by an increase in approximation power, outperforming
PINN alternatives in both accuracy and runtime.
Beyond the empirical evidence we give here, the translation of classic PDE
theory in terms of the Sobolev-space approximates suggests that the PSMs are
universally applicable to well-posed, regular forward and inverse PDE problems.
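To make the convexity claim concrete, below is a minimal PSM-style sketch in Python: a Chebyshev expansion for a 1D Poisson problem, where the variational formulation collapses to an ordinary linear least-squares problem over the coefficients. The degree, collocation grid, and test problem are illustrative assumptions, not the paper's setup.

```python
# A minimal PSM-style sketch (illustrative assumptions, not the authors' code):
# approximate u by a Chebyshev expansion and solve the linear PDE
#   -u''(x) = pi^2 sin(pi x) on [-1, 1],  u(-1) = u(1) = 0,
# as a convex linear least-squares problem over the expansion coefficients.
import numpy as np
import numpy.polynomial.chebyshev as cheb

N = 32                                     # polynomial degree (assumed)
x = np.cos(np.pi * np.arange(N + 1) / N)   # Chebyshev-Lobatto collocation points
f = np.pi**2 * np.sin(np.pi * x)           # right-hand side; exact u = sin(pi x)

# Column k holds -T_k''(x) at the collocation points.
A_pde = np.column_stack(
    [-cheb.chebval(x, cheb.chebder(np.eye(N + 1)[k], 2)) for k in range(N + 1)]
)
# Boundary rows enforce u(+-1) = 0, using T_k(+-1) = (+-1)^k.
A_bc = np.array([[(-1.0) ** k for k in range(N + 1)],
                 [1.0] * (N + 1)])

A = np.vstack([A_pde, A_bc])
b = np.concatenate([f, [0.0, 0.0]])

coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)  # convex problem, direct solve
u_hat = cheb.chebval(x, coeffs)
print("max error:", np.max(np.abs(u_hat - np.sin(np.pi * x))))  # near machine precision
```

Because the PDE is linear, the objective is a convex quadratic in the coefficients, which is the setting in which the abstract's exponential convergence rate of variational gradient descent applies and no HPC is needed.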
Related papers
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver) capable of solving a wide scope of PDEs.
Our key finding is that a PDE solution is fundamentally governed by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z) - RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
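A hedged sketch of how region optimization might look in code (an assumed reading of the paradigm, not RoPINN's actual algorithm): instead of evaluating the PINN residual at a fixed set of collocation points, each step resamples points from a small neighbourhood of every anchor point. The toy PDE, model, and region radius are assumptions.

```python
# Illustrative region-optimization step (an assumed reading of RoPINN, not its code):
# resample collocation points from a small region around each anchor per step.
import torch

def residual(model, x):
    """Residual of the toy PDE u''(x) + pi^2 sin(pi x) = 0 (illustrative)."""
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return d2u + torch.pi**2 * torch.sin(torch.pi * x)

def region_loss(model, anchors, radius=0.05):
    # Uniform resampling from [x - r, x + r] around each anchor point;
    # over many steps this approximates the residual averaged over each region.
    perturbed = anchors + radius * (2 * torch.rand_like(anchors) - 1)
    return residual(model, perturbed).pow(2).mean()

model = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                            torch.nn.Linear(32, 1))
anchors = torch.linspace(-1, 1, 64).unsqueeze(1)
region_loss(model, anchors).backward()  # one optimiser step would follow
```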
arXiv Detail & Related papers (2024-05-23T09:45:57Z) - Approximation of Solution Operators for High-dimensional PDEs [2.3076986663832044]
We propose a finite-dimensional control-based method to approximate solution operators for evolution partial differential equations.
Results are presented for several high-dimensional PDEs, including real-world applications to solving Hamilton-Jacobi-Bellman equations.
arXiv Detail & Related papers (2024-01-18T21:45:09Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
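The deep-equilibrium idea can be sketched generically (a schematic weight-tied fixed-point layer, not the FNO-DEQ architecture): a single shared layer is iterated until z = layer(z, a), mirroring the fixed-point structure of a steady-state PDE. The shapes and the naive forward iteration are assumptions.

```python
# Schematic weight-tied equilibrium solver (not the FNO-DEQ architecture):
# iterate one shared layer to a fixed point z* = layer(z*, a), mirroring the
# fixed-point structure of a steady-state PDE.
import torch

class WeightTiedSolver(torch.nn.Module):
    def __init__(self, width=64):
        super().__init__()
        self.width = width
        self.layer = torch.nn.Sequential(
            torch.nn.Linear(2 * width, width), torch.nn.Tanh())

    def forward(self, a, iters=50, tol=1e-6):
        z = torch.zeros(a.shape[0], self.width)
        for _ in range(iters):              # naive forward fixed-point iteration
            z_next = self.layer(torch.cat([z, a], dim=-1))
            if (z_next - z).norm() < tol:   # stop once the iterate stabilises
                return z_next
            z = z_next
        return z

a = torch.randn(16, 64)            # encoded PDE coefficients/forcing (placeholder)
z_star = WeightTiedSolver()(a)     # approximate equilibrium representation
```

A production DEQ would replace the naive loop with a root-finder and differentiate through the equilibrium implicitly rather than through the unrolled iterations.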
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Lie Point Symmetry and Physics Informed Networks [59.56218517113066]
We propose a loss function that informs the network about Lie point symmetries in the same way that PINN models try to enforce the underlying PDE through a loss function.
Our symmetry loss ensures that the infinitesimal generators of the Lie group conserve the PDE solutions.
Empirical evaluations indicate that the inductive bias introduced by the Lie point symmetries of the PDEs greatly boosts the sample efficiency of PINNs.
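One concrete way to read the symmetry-loss idea, sketched for the heat equation u_t = u_xx (an illustrative variant, not the paper's exact loss): the translation generator d/dx maps solutions to solutions, so the PDE residual of u_x can be penalised alongside the residual of u itself.

```python
# Hedged sketch of a symmetry-augmented PINN loss for u_t = u_xx (an
# illustrative reading, not the paper's exact loss). Since d/dx generates a
# symmetry, u_x of an exact solution again solves the heat equation, so its
# PDE residual is a natural extra penalty.
import torch

def heat_residual(u, x, t):
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - u_xx

def symmetry_pinn_loss(model, x, t, sym_weight=0.1):
    x, t = x.requires_grad_(True), t.requires_grad_(True)
    u = model(torch.cat([x, t], dim=-1))
    res = heat_residual(u, x, t)            # standard PINN residual
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    sym_res = heat_residual(u_x, x, t)      # residual of the symmetry image u_x
    return res.pow(2).mean() + sym_weight * sym_res.pow(2).mean()

model = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(),
                            torch.nn.Linear(32, 1))
x, t = torch.rand(128, 1), torch.rand(128, 1)
loss = symmetry_pinn_loss(model, x, t)
```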
arXiv Detail & Related papers (2023-11-07T19:07:16Z) - Error Analysis of Kernel/GP Methods for Nonlinear and Parametric PDEs [16.089904161628258]
We introduce a priori Sobolev-space error estimates for the solution of nonlinear, and possibly parametric, PDEs.
The proof is articulated around Sobolev norm error estimates for kernel interpolants.
The error estimates demonstrate dimension-benign convergence rates if the solution space of the PDE is smooth enough.
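For orientation, estimates of this kind typically take the shape of a standard Sobolev bound for kernel interpolants (stated here as background, not as the paper's exact theorem): if the kernel's RKHS is norm-equivalent to H^s(Ω) and u ∈ H^s(Ω), then for an interpolant u_h on points with fill distance h,

```latex
% Standard kernel-interpolation bound (background, not the paper's statement):
\| u - u_h \|_{H^t(\Omega)} \;\le\; C \, h^{\, s - t} \, \| u \|_{H^s(\Omega)},
\qquad 0 \le t \le s .
```

The smoother the solution (the larger s), the faster the rate, which is the sense in which convergence can remain benign as the dimension grows.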
arXiv Detail & Related papers (2023-05-08T18:00:33Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields an average relative gain of 11.5% on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Parameter Inference based on Gaussian Processes Informed by Nonlinear
Partial Differential Equations [6.230751621285322]
Partial differential equations (PDEs) are widely used for the description of physical and engineering phenomena.
Some key parameters involved in PDEs, which represent certain physical properties with important scientific interpretations, are difficult or even impossible to measure directly.
We propose a novel method for the inference of unknown parameters in PDEs, called the PDE-Informed Gaussian Process (PIGP) based parameter inference method.
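A simplified stand-in for PDE-informed parameter inference (not the PIGP algorithm itself): observe a noisy solution field of u_t = θ u_xx on a grid, form numerical derivatives, and estimate θ by least squares. The test solution, noise level, and the finite-difference surrogate standing in for the GP are all assumptions.

```python
# Simplified stand-in for PDE-informed parameter inference (not the PIGP
# algorithm): estimate the diffusivity theta in u_t = theta * u_xx from a
# noisy solution field via numerical derivatives and least squares.
import numpy as np

theta_true = 0.7
x = np.linspace(0.0, 1.0, 101)
t = np.linspace(0.0, 0.1, 51)
X, T = np.meshgrid(x, t, indexing="ij")
u = np.exp(-theta_true * np.pi**2 * T) * np.sin(np.pi * X)        # exact solution
u_obs = u + 1e-5 * np.random.default_rng(0).normal(size=u.shape)  # small noise (assumed)

u_t = np.gradient(u_obs, t, axis=1)          # numerical time derivative
u_xx = np.gradient(np.gradient(u_obs, x, axis=0), x, axis=0)

# Least-squares estimate theta* = <u_xx, u_t> / <u_xx, u_xx> on interior points.
inner = (slice(2, -2), slice(2, -2))         # drop boundary stencils
theta_hat = np.sum(u_xx[inner] * u_t[inner]) / np.sum(u_xx[inner] ** 2)
print(f"theta_hat = {theta_hat:.4f}  (true value {theta_true})")
```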
arXiv Detail & Related papers (2022-12-22T17:14:51Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate the large data requirements of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
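A minimal sketch of symmetry-based augmentation (one assumed transformation, not the paper's exhaustive list): for the heat equation with periodic boundary conditions, spatial translation maps solutions to solutions, so rolling a sampled solution field along x yields a new valid training sample for free.

```python
# Minimal symmetry augmentation sketch (one assumed transformation, not the
# paper's exhaustive list): for periodic heat-equation solutions, a spatial
# translation x -> x + a maps solutions to solutions, so rolling a solution
# field along x yields a new valid training sample.
import numpy as np

def translate_augment(u_fields, rng):
    """u_fields: (batch, nx, nt) array of periodic-in-x solution snapshots."""
    batch, nx, _ = u_fields.shape
    shifts = rng.integers(0, nx, size=batch)      # random shift per sample
    return np.stack([np.roll(u, s, axis=0) for u, s in zip(u_fields, shifts)])

rng = np.random.default_rng(0)
u_fields = rng.normal(size=(8, 64, 32))   # placeholder solution batch
augmented = translate_augment(u_fields, rng)
```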
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - Solving and Learning Nonlinear PDEs with Gaussian Processes [11.09729362243947]
We introduce a simple, rigorous, and unified framework for solving nonlinear partial differential equations.
The proposed approach provides a natural generalization of collocation kernel methods to nonlinear PDEs and IPs.
For IPs, while the traditional approach has been to iterate between the identifications of parameters in the PDE and the numerical approximation of its solution, our algorithm tackles both simultaneously.
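A forward-problem sketch in the spirit of kernel collocation for nonlinear PDEs (a schematic of the framework's flavour, not its exact algorithm): represent u as a Gaussian-kernel expansion and solve the collocated residuals of a nonlinear boundary-value problem with a nonlinear least-squares solver. The kernel, length scale, and test problem are assumptions.

```python
# Forward-problem sketch in the spirit of kernel collocation (schematic, not
# the paper's algorithm): represent u as a Gaussian-kernel expansion and solve
#   -u'' + u^3 = f on [0, 1],  u(0) = u(1) = 0,
# by nonlinear least squares on the collocated residuals.
import numpy as np
from scipy.optimize import least_squares

ell = 0.12                                  # kernel length scale (assumed)
xc = np.linspace(0.0, 1.0, 20)              # centres = collocation points

def k(x, c):                                # Gaussian kernel matrix
    return np.exp(-(x[:, None] - c[None, :]) ** 2 / (2 * ell**2))

def k_xx(x, c):                             # second x-derivative of the kernel
    d = x[:, None] - c[None, :]
    return (d**2 / ell**4 - 1.0 / ell**2) * np.exp(-d**2 / (2 * ell**2))

f = np.pi**2 * np.sin(np.pi * xc) + np.sin(np.pi * xc) ** 3  # exact u = sin(pi x)

def residuals(alpha):
    u = k(xc, xc) @ alpha
    u_xx = k_xx(xc, xc) @ alpha
    pde = -u_xx + u**3 - f                        # nonlinear collocation residual
    bc = k(np.array([0.0, 1.0]), xc) @ alpha      # boundary values u(0), u(1)
    return np.concatenate([pde, bc])

sol = least_squares(residuals, np.zeros(xc.size))  # Gauss-Newton-type solve
u_hat = k(np.linspace(0.0, 1.0, 200), xc) @ sol.x  # evaluate the kernel expansion
```

Tackling an inverse problem simultaneously, as the summary above describes, would simply append the unknown PDE parameters to the optimisation variables.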
arXiv Detail & Related papers (2021-03-24T03:16:08Z)