Quantitative Approximation for Neural Operators in Nonlinear Parabolic Equations
- URL: http://arxiv.org/abs/2410.02151v1
- Date: Thu, 3 Oct 2024 02:28:17 GMT
- Title: Quantitative Approximation for Neural Operators in Nonlinear Parabolic Equations
- Authors: Takashi Furuya, Koichi Taniguchi, Satoshi Okuda
- Abstract summary: We derive the approximation rate of solution operators for nonlinear parabolic partial differential equations (PDEs).
Our results show that neural operators can efficiently approximate these solution operators without exponential growth in model complexity.
A key insight in our proof is to transform the PDEs into their corresponding integral equations via Duhamel's principle, and to leverage the similarity between neural operators and Picard's iteration.
- Score: 0.40964539027092917
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural operators serve as universal approximators for general continuous operators. In this paper, we derive the approximation rate of solution operators for nonlinear parabolic partial differential equations (PDEs), contributing to the quantitative approximation theorem for solution operators of nonlinear PDEs. Our results show that neural operators can efficiently approximate these solution operators without exponential growth in model complexity, thus strengthening the theoretical foundation of neural operators. A key insight in our proof is to transform the PDEs into their corresponding integral equations via Duhamel's principle, and to leverage the similarity between neural operators and Picard's iteration, a classical algorithm for solving PDEs. This approach is potentially generalizable beyond parabolic PDEs to a range of other equations, including the Navier-Stokes equation, nonlinear Schrödinger equations, and nonlinear wave equations, which can be solved by Picard's iteration.
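As a concrete illustration of the Duhamel/Picard connection described in the abstract, the sketch below applies Picard's iteration to the mild (Duhamel) formulation of a 1D semilinear heat equation on a periodic domain. This is a toy numerical illustration under assumed choices (cubic nonlinearity, FFT-based heat semigroup, left Riemann sum for the Duhamel integral), not the construction used in the paper; the names `heat_semigroup` and `picard_iteration` are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): Picard iteration for the mild
# (Duhamel) formulation of a 1D semilinear heat equation on the torus [0, 2*pi):
#     u_t = u_xx + N(u),
#     u(t) = e^{t*Laplacian} u0 + int_0^t e^{(t-s)*Laplacian} N(u(s)) ds.
# The heat semigroup is applied exactly in Fourier space; the Duhamel integral is
# approximated with a left Riemann sum in time.

def heat_semigroup(v_hat, k, t):
    """Apply e^{t*Laplacian} to Fourier coefficients v_hat with wavenumbers k."""
    return np.exp(-(k ** 2) * t) * v_hat

def picard_iteration(u0, nonlinearity, T, n_time=64, n_iter=6):
    n = u0.size
    k = np.fft.fftfreq(n, d=1.0 / n)      # integer wavenumbers on [0, 2*pi)
    ts = np.linspace(0.0, T, n_time)
    dt = ts[1] - ts[0]
    u0_hat = np.fft.fft(u0)

    # Iterate 0: the free linear evolution e^{t*Laplacian} u0 at every time step.
    u = np.array([np.fft.ifft(heat_semigroup(u0_hat, k, t)).real for t in ts])

    for _ in range(n_iter):
        new_u = np.empty_like(u)
        for i, t in enumerate(ts):
            linear = np.fft.ifft(heat_semigroup(u0_hat, k, t)).real
            # Duhamel term: sum over t_j < t of e^{(t - t_j)*Laplacian} N(u(t_j)) * dt.
            duhamel = sum(
                np.fft.ifft(heat_semigroup(np.fft.fft(nonlinearity(u[j])), k, t - ts[j])).real * dt
                for j in range(i)
            )
            new_u[i] = linear + duhamel
        u = new_u
    return ts, u

# Example: cubic absorption N(u) = -u^3 starting from a sine wave.
x = np.linspace(0.0, 2 * np.pi, 128, endpoint=False)
ts, u = picard_iteration(np.sin(x), lambda v: -v ** 3, T=0.5)
```

Each Picard sweep composes a fixed linear integral operator (the heat semigroup) with a pointwise nonlinearity, which is the kind of structural similarity between Picard's iteration and neural operator layers that the abstract refers to.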
Related papers
- Generalized and new solutions of the NRT nonlinear Schrödinger equation [0.0]
We present new solutions of the non-linear Schrödinger equation proposed by Nobre, Rego-Monteiro and Tsallis for the free particle.
Analytical expressions for the wave function, the auxiliary field and the probability density are derived using a variety of approaches.
arXiv Detail & Related papers (2024-10-26T17:02:33Z)
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Solving Partial Differential Equations in Different Domains by Operator Learning method Based on Boundary Integral Equations [13.495279709392104]
This article explores operator learning models that can deduce solutions to partial differential equations (PDEs) on arbitrary domains without requiring retraining.
We introduce two innovative models rooted in boundary integral equations (BIEs)
Once fully trained, these BIE-based models adeptly predict the solutions of PDEs in any domain without the need for additional training.
arXiv Detail & Related papers (2024-06-04T13:19:06Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
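For intuition, the deep-equilibrium idea can be sketched as iterating a single weight-tied layer to a fixed point z* = f(z*, a) rather than stacking many distinct layers. The sketch below is a generic illustration with random placeholder weights and a damped fixed-point solver; it is not the FNO-DEQ architecture itself, and all names and sizes are assumptions.

```python
import numpy as np

# Schematic of the deep-equilibrium idea (not the FNO-DEQ implementation): a single
# weight-tied layer f(z, a) is iterated until z reaches a fixed point z* = f(z*, a).
# Weights are random placeholders, scaled so that the map is a contraction.

rng = np.random.default_rng(0)
W_z = rng.standard_normal((32, 32)) / 32     # weight on the hidden state z (contractive)
W_a = rng.standard_normal((32, 16))          # weight on the input (e.g. PDE coefficients)

def layer(z, a):
    """One weight-tied layer applied to state z and input a."""
    return np.tanh(z @ W_z.T + a @ W_a.T)

def solve_equilibrium(a, damping=0.5, tol=1e-8, max_iter=500):
    """Damped fixed-point iteration for z* = layer(z*, a)."""
    z = np.zeros((a.shape[0], 32))
    for _ in range(max_iter):
        z_new = (1 - damping) * z + damping * layer(z, a)
        if np.max(np.abs(z_new - z)) < tol:
            return z_new
        z = z_new
    return z

a = rng.standard_normal((4, 16))             # a batch of 4 inputs
z_star = solve_equilibrium(a)                # equilibrium state, decoded downstream
```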
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Koopman neural operator as a mesh-free solver of non-linear partial differential equations [15.410070455154138]
We propose the Koopman neural operator (KNO), a new neural operator, to overcome these challenges.
By approximating the Koopman operator, an infinite-dimensional operator governing all possible observations of the dynamic system, we can equivalently learn the solution of a non-linear PDE family.
The KNO exhibits notable advantages compared with previous state-of-the-art models.
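To make the idea of approximating the Koopman operator concrete, the sketch below uses a plain dynamic-mode-decomposition (DMD) style fit: a finite-dimensional linear map K is estimated from snapshot pairs and then used to roll the dynamics forward. This is a generic illustration with synthetic placeholder data, not the KNO architecture.

```python
import numpy as np

# DMD-style sketch of approximating the Koopman operator (illustrative only, not KNO):
# fit a linear map K with Y ≈ K X from snapshot pairs (x_t, x_{t+1}), then advance
# the system with K. The trajectory below is synthetic placeholder data.

rng = np.random.default_rng(1)
A_true = np.eye(64) * 0.95                   # toy stable dynamics on a 64-dim field
states = [rng.standard_normal(64)]
for _ in range(200):
    states.append(A_true @ states[-1] + 0.01 * rng.standard_normal(64))
traj = np.stack(states, axis=1)              # shape (64, 201)

X, Y = traj[:, :-1], traj[:, 1:]             # snapshot pairs
K = Y @ np.linalg.pinv(X)                    # least-squares finite Koopman approximation

def predict(x0, n_steps):
    """Advance an initial state n_steps forward with the learned linear operator K."""
    preds = [x0]
    for _ in range(n_steps):
        preds.append(K @ preds[-1])
    return np.stack(preds, axis=1)

forecast = predict(traj[:, 0], 10)
```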
arXiv Detail & Related papers (2023-01-24T14:10:15Z)
- Nonlinear Reconstruction for Operator Learning of PDEs with Discontinuities [5.735035463793008]
A large class of hyperbolic and advection-dominated PDEs can have solutions with discontinuities.
We rigorously prove, in terms of lower approximation bounds, that methods which entail a linear reconstruction step fail to efficiently approximate the solution operator of such PDEs.
We show that certain methods employing a non-linear reconstruction mechanism can overcome these fundamental lower bounds and approximate the underlying operator efficiently.
arXiv Detail & Related papers (2022-10-03T16:47:56Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
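The "data plus PDE constraints" idea can be illustrated with a hybrid training loss that adds a PDE-residual penalty to an ordinary data-fitting term. The sketch below is schematic: the Poisson residual, the finite-difference discretization, and the weighting are illustrative assumptions, not PINO's actual loss or code.

```python
import numpy as np

# Schematic hybrid physics-informed operator loss (not PINO's implementation):
# a data term compares predictions with reference solutions, and a physics term
# penalizes the residual of the PDE the prediction should satisfy.

def data_loss(u_pred, u_ref):
    return np.mean((u_pred - u_ref) ** 2)

def pde_residual_loss(u_pred, f, dx):
    # Example residual for a 1D Poisson equation -u_xx = f, discretized with a
    # second-order centered difference on a periodic grid (illustrative choice only).
    u_xx = (np.roll(u_pred, -1, axis=-1) - 2 * u_pred + np.roll(u_pred, 1, axis=-1)) / dx ** 2
    return np.mean((-u_xx - f) ** 2)

def hybrid_loss(u_pred, u_ref, f, dx, weight=0.1):
    return data_loss(u_pred, u_ref) + weight * pde_residual_loss(u_pred, f, dx)

# Example with random placeholder arrays standing in for a batch of 16 predictions.
rng = np.random.default_rng(4)
u_pred, u_ref, f = rng.standard_normal((3, 16, 64))
loss = hybrid_loss(u_pred, u_ref, f, dx=1.0 / 64)
```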
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
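A common way to realize such operator layers is a pointwise linear map plus a learned kernel integral, discretized by averaging over grid points. The sketch below is a schematic of that layer form with random placeholder weights and a toy Fourier-feature kernel; it is an assumption-laden illustration, not the parameterization used in the paper.

```python
import numpy as np

# Schematic neural-operator layer (one common form, not the paper's exact construction):
#     v_{l+1}(x) = sigma( W v_l(x) + (1/n) * sum_y kappa_theta(x, y) v_l(y) )
# i.e. a pointwise linear map plus a learned kernel integral, discretized on a grid.

rng = np.random.default_rng(2)
n_grid, d_in, d_out = 64, 8, 8
x = np.linspace(0.0, 1.0, n_grid)

W = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)        # pointwise weight
kernel_weights = rng.standard_normal((d_out, d_in, 4)) / 4.0   # placeholder kernel params

def kappa(xi, yj):
    """Toy learned kernel kappa_theta(x, y): a small Fourier-feature expansion of (x - y)."""
    feats = np.array([np.cos(2 * np.pi * m * (xi - yj)) for m in range(4)])
    return kernel_weights @ feats                               # shape (d_out, d_in)

def operator_layer(v):
    """v has shape (n_grid, d_in); returns shape (n_grid, d_out)."""
    out = np.empty((n_grid, d_out))
    for i, xi in enumerate(x):
        integral = sum(kappa(xi, yj) @ v[j] for j, yj in enumerate(x)) / n_grid
        out[i] = np.maximum(W @ v[i] + integral, 0.0)           # ReLU activation
    return out

v0 = rng.standard_normal((n_grid, d_in))
v1 = operator_layer(v0)
```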
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Solving and Learning Nonlinear PDEs with Gaussian Processes [11.09729362243947]
We introduce a simple, rigorous, and unified framework for solving nonlinear partial differential equations.
The proposed approach provides a natural generalization of collocation kernel methods to nonlinear PDEs and inverse problems (IPs).
For IPs, while the traditional approach has been to iterate between the identifications of parameters in the PDE and the numerical approximation of its solution, our algorithm tackles both simultaneously.
arXiv Detail & Related papers (2021-03-24T03:16:08Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
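The core Fourier-layer idea (parameterizing the integral kernel in Fourier space) can be sketched in a few lines: transform the input to Fourier space, multiply a truncated set of low modes by learned complex weights, transform back, and add a pointwise linear path. The code below is a minimal 1D schematic with random placeholder weights, not the released FNO implementation.

```python
import numpy as np

# Minimal 1D sketch of a Fourier layer: FFT, multiply the lowest modes by learned
# complex weights R, inverse FFT, add a pointwise linear path, apply a nonlinearity.
# Weights are random placeholders, not trained parameters.

rng = np.random.default_rng(3)
n_grid, channels, n_modes = 128, 4, 16

R = (rng.standard_normal((n_modes, channels, channels))
     + 1j * rng.standard_normal((n_modes, channels, channels))) / channels  # spectral weights
W = rng.standard_normal((channels, channels)) / channels                     # pointwise weights

def fourier_layer(v):
    """v has shape (n_grid, channels); returns the same shape."""
    v_hat = np.fft.rfft(v, axis=0)                     # (n_grid // 2 + 1, channels)
    out_hat = np.zeros_like(v_hat)
    for m in range(n_modes):                           # keep only the lowest modes
        out_hat[m] = R[m] @ v_hat[m]
    spectral = np.fft.irfft(out_hat, n=n_grid, axis=0) # back to physical space
    return np.maximum(spectral + v @ W.T, 0.0)         # add pointwise path, ReLU

v = rng.standard_normal((n_grid, channels))
u = fourier_layer(v)
```

Because the learned weights act on Fourier modes rather than on grid points, the same layer can be evaluated on different discretizations of the input function.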
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.