DeepONet for Solving Nonlinear Partial Differential Equations with Physics-Informed Training
- URL: http://arxiv.org/abs/2410.04344v2
- Date: Wed, 22 Jan 2025 21:11:24 GMT
- Title: DeepONet for Solving Nonlinear Partial Differential Equations with Physics-Informed Training
- Authors: Yahong Yang
- Abstract summary: We investigate the use of operator learning, specifically DeepONet, for solving nonlinear partial differential equations (PDEs).
This study examines the performance of DeepONet in physics-informed training, focusing on two key aspects: (1) the approximation capabilities of deep branch and trunk networks, and (2) the generalization error in Sobolev norms.
- Score: 2.44755919161855
- Abstract: In this paper, we investigate the use of operator learning, specifically DeepONet, for solving nonlinear partial differential equations (PDEs). Unlike conventional function learning methods that require training separate neural networks for each PDE, operator learning enables generalization across different PDEs without retraining. This study examines the performance of DeepONet in physics-informed training, focusing on two key aspects: (1) the approximation capabilities of deep branch and trunk networks, and (2) the generalization error in Sobolev norms. Our results demonstrate that deep branch networks provide substantial performance improvements, while trunk networks achieve optimal results when kept relatively simple. Furthermore, we derive a bound on the generalization error of DeepONet for solving nonlinear PDEs by analyzing the Rademacher complexity of its derivatives in terms of pseudo-dimension. This work bridges a critical theoretical gap by delivering rigorous error estimates that apply to a wide range of physics-informed machine learning models and applications.
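To make the setup concrete, here is a minimal sketch (in PyTorch, not the paper's code) of a DeepONet trained with a physics-informed residual loss, pairing a deep branch network with a relatively shallow trunk in line with the abstract's finding. The PDE (-u'' + u^3 = f on (0, 1)), sensor grid, network sizes, and the omission of boundary-condition terms are all illustrative assumptions.

```python
# Minimal DeepONet + physics-informed loss sketch (illustrative assumptions).
import torch
import torch.nn as nn

def mlp(d_in, d_out, width, depth):
    layers, d = [], d_in
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.Tanh()]
        d = width
    return nn.Sequential(*layers, nn.Linear(d, d_out))

class DeepONet(nn.Module):
    def __init__(self, n_sensors, p=32, width=64):
        super().__init__()
        self.branch = mlp(n_sensors, p, width, depth=4)  # deep branch net
        self.trunk = mlp(1, p, width, depth=2)           # shallow trunk net

    def forward(self, f_sensors, x):
        # G(f)(x) ~= <branch(f), trunk(x)>
        b = self.branch(f_sensors)        # (1, p)
        t = self.trunk(x)                 # (n_points, p)
        return (t * b).sum(dim=1)         # (n_points,)

def physics_informed_loss(model, f_sensors, f_at_x, x):
    # Squared residual of the illustrative nonlinear PDE -u'' + u^3 = f;
    # boundary terms are omitted for brevity.
    x = x.detach().clone().requires_grad_(True)
    u = model(f_sensors, x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return ((-d2u.squeeze(1) + u ** 3 - f_at_x) ** 2).mean()

n_sensors = 100
xs = torch.linspace(0.0, 1.0, n_sensors)
f_sensors = torch.sin(torch.pi * xs).unsqueeze(0)   # one sampled input function
x = torch.rand(128, 1)                              # collocation points
f_at_x = torch.sin(torch.pi * x).squeeze(1)         # f at collocation points
model = DeepONet(n_sensors)
loss = physics_informed_loss(model, f_sensors, f_at_x, x)
loss.backward()
print(float(loss))
```

Derivatives for the residual come from autograd applied through the trunk, which is why physics-informed training of operator networks requires bounding the complexity of the network's derivatives, as the abstract describes.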
Related papers
- Quantifying Training Difficulty and Accelerating Convergence in Neural Network-Based PDE Solvers [9.936559796069844]
We investigate the training dynamics of neural network-based PDE solvers.
We find that two techniques, partition of unity (PoU) and variance scaling (VS), enhance the effective rank; a minimal PoU sketch follows this entry.
Experiments using popular PDE-solving frameworks such as PINN, Deep Ritz, and the operator-learning framework DeepONet confirm that these techniques consistently speed up convergence.
arXiv Detail & Related papers (2024-10-08T19:35:19Z)
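A hedged sketch of the partition-of-unity idea from the entry above: overlapping windows whose weights sum to one, each owning its own subnetwork. The Gaussian windows, window count, and network sizes are illustrative assumptions, not the paper's exact construction.

```python
# Partition-of-unity network sketch on [0, 1] (illustrative assumptions).
import torch
import torch.nn as nn

class PoUNet(nn.Module):
    def __init__(self, n_windows=4, width=32):
        super().__init__()
        self.centers = torch.linspace(0.0, 1.0, n_windows).view(1, -1)
        self.nets = nn.ModuleList(
            nn.Sequential(nn.Linear(1, width), nn.Tanh(), nn.Linear(width, 1))
            for _ in range(n_windows)
        )

    def forward(self, x):  # x: (batch, 1)
        # Gaussian bumps, normalized to sum to one, form the partition of unity.
        bumps = torch.exp(-((x - self.centers) * 4.0) ** 2)  # (batch, n_windows)
        weights = bumps / bumps.sum(dim=1, keepdim=True)
        locals_ = torch.cat([net(x) for net in self.nets], dim=1)
        return (weights * locals_).sum(dim=1, keepdim=True)

x = torch.rand(16, 1)
print(PoUNet()(x).shape)  # torch.Size([16, 1])
```

Localizing each subnetwork to a window is what the entry credits with raising the effective rank of the training dynamics.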
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve performance gains of up to 48% on PDE benchmark datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
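The exact ProdLayer design is specified in the DimOL paper; as a purely hypothetical illustration, one way a product layer could inject the multiplicative feature interactions that dimensional analysis suggests is to add elementwise products of projected channel pairs to a linear map.

```python
# Hypothetical product-layer sketch; the actual ProdLayer is defined in the
# DimOL paper and may differ. Here, paired projections are multiplied
# elementwise, mirroring how physical quantities combine multiplicatively.
import torch
import torch.nn as nn

class ProdLayerSketch(nn.Module):
    def __init__(self, dim: int, n_pairs: int = 4):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.proj_a = nn.Linear(dim, n_pairs)
        self.proj_b = nn.Linear(dim, n_pairs)
        self.mix = nn.Linear(n_pairs, dim)

    def forward(self, x):
        products = self.proj_a(x) * self.proj_b(x)  # pairwise product terms
        return self.linear(x) + self.mix(products)

x = torch.randn(8, 32)
print(ProdLayerSketch(32)(x).shape)  # torch.Size([8, 32])
```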
- Nearly Optimal VC-Dimension and Pseudo-Dimension Bounds for Deep Neural Network Derivatives [13.300625539460217]
This paper addresses the problem of nearly optimal Vapnik--Chervonenkis dimension (VC-dimension) and pseudo-dimension estimation for the derivative functions of deep neural networks (DNNs).
Two important applications of these estimations include: 1) Establishing a nearly tight approximation result of DNNs in the Sobolev space; 2) Characterizing the generalization error of machine learning methods with loss functions involving function derivatives.
arXiv Detail & Related papers (2023-05-15T09:10:12Z)
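For context, pseudo-dimension bounds of this kind typically enter generalization analyses in the following generic form: a standard high-probability bound for a bounded loss over n i.i.d. samples, shown for flavor only and not the paper's exact statement.

```latex
% Illustrative generic pseudo-dimension generalization bound (bounded loss
% \ell, hypothesis class \mathcal{H} of network derivatives, n i.i.d.
% samples; constants and failure probability elided):
\sup_{h \in \mathcal{H}}
  \left| \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(h(x_i)\bigr)
         - \mathbb{E}\bigl[\ell(h(x))\bigr] \right|
  \;=\; \mathcal{O}\!\left(
    \sqrt{\frac{\operatorname{Pdim}(\mathcal{H})\,\log n}{n}} \right)
```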
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
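To make the implicit update concrete, here is a minimal sketch under stated assumptions: a toy stiff quadratic loss and a proximal inner solver. The paper's ISGD variant for PINNs may solve the implicit equation differently.

```python
# Implicit gradient step sketch: theta+ = theta - lr * grad L(theta+) is the
# optimality condition of the proximal problem
#   theta+ = argmin_z  L(z) + ||z - theta||^2 / (2 * lr),
# solved here approximately by a few inner gradient steps.
import torch

def loss_fn(theta):
    # Illustrative stiff quadratic: explicit GD with lr=0.1 diverges here.
    return 50.0 * theta[0] ** 2 + 0.5 * theta[1] ** 2

def implicit_gd_step(theta, lr=0.1, n_inner=50, inner_lr=5e-3):
    z = theta.clone()
    for _ in range(n_inner):
        zg = z.detach().requires_grad_(True)
        obj = loss_fn(zg) + ((zg - theta) ** 2).sum() / (2 * lr)
        (g,) = torch.autograd.grad(obj, zg)
        z = zg.detach() - inner_lr * g
    return z

theta = torch.tensor([1.0, 1.0])
for _ in range(20):
    theta = implicit_gd_step(theta)  # stable despite the stiff direction
print(theta)  # converges toward the minimizer at the origin
```

The implicit step damps the stiff direction by a factor of 1/(1 + lr * curvature) per update, which is the stability property the entry highlights.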
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced-order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
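A hedged sketch of the POD-plus-network recipe described above, with random stand-in snapshot data: in practice the snapshots would come from PDE solves, and the parameter-to-coefficient network would be trained against them.

```python
# POD basis from snapshots via SVD, plus a branch-style network mapping PDE
# parameters to POD coefficients (snapshot data and sizes are stand-ins).
import numpy as np
import torch
import torch.nn as nn

n_grid, n_snapshots, n_modes = 200, 50, 8
snapshots = np.random.rand(n_grid, n_snapshots)  # stand-in for PDE solves

# POD basis = leading left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
pod_basis = torch.tensor(U[:, :n_modes], dtype=torch.float32)  # (n_grid, n_modes)

# Branch-style network: PDE parameters -> POD coefficients.
coeff_net = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, n_modes))

params = torch.randn(16, 3)      # batch of PDE parameter vectors
coeffs = coeff_net(params)       # (16, n_modes)
u_approx = coeffs @ pod_basis.T  # reduced-order solutions on the grid
print(u_approx.shape)            # torch.Size([16, 200])
```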
- Generic bounds on the approximation error for physics-informed (and) operator learning [7.6146285961466]
We propose a framework for deriving rigorous bounds on the approximation error for physics-informed neural networks (PINNs) and operator-learning architectures such as DeepONets and FNOs.
These bounds guarantee that PINNs and (physics-informed) DeepONets or FNOs can efficiently approximate the underlying solution or solution operator of generic partial differential equations (PDEs).
arXiv Detail & Related papers (2022-05-23T15:40:33Z)
- Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics [11.981731023317945]
Physics-informed deep learning (PiDL) addresses the challenges of purely data-driven modeling by incorporating physical principles into the model.
We propose to leverage prior physics knowledge by "baking" the discretized governing equations into the neural network architecture.
This method, which embeds discretized PDEs through convolutional residual networks in a multi-resolution setting, substantially improves generalizability and long-term prediction accuracy.
arXiv Detail & Related papers (2022-05-09T01:27:58Z)
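One way the "baking in" of a discretized PDE can look, sketched under assumed 2-D diffusion physics: a fixed finite-difference Laplacian stencil implemented as a convolution, plus a small trainable correction. The paper's multi-resolution architecture is more elaborate.

```python
# Residual block embedding a discretized PDE: u <- u + dt * (nu * Lap(u) + c(u)),
# where Lap is a fixed 5-point stencil and c is a trainable correction.
import torch
import torch.nn as nn
import torch.nn.functional as F

# 5-point Laplacian stencil as a fixed (non-trainable) conv kernel.
lap_kernel = torch.tensor([[0., 1., 0.],
                           [1., -4., 1.],
                           [0., 1., 0.]]).view(1, 1, 3, 3)

class PDEPreservedBlock(nn.Module):
    def __init__(self, dt=0.01, nu=0.1):
        super().__init__()
        self.dt, self.nu = dt, nu
        # Trainable correction for unresolved or unknown physics.
        self.correction = nn.Conv2d(1, 1, kernel_size=3, padding=1)

    def forward(self, u):  # u: (batch, 1, H, W)
        lap = F.conv2d(F.pad(u, (1, 1, 1, 1), mode="replicate"), lap_kernel)
        return u + self.dt * (self.nu * lap + self.correction(u))

u = torch.rand(4, 1, 32, 32)
print(PDEPreservedBlock()(u).shape)  # torch.Size([4, 1, 32, 32])
```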
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of physics-informed neural networks (PINNs).
In particular, we parameterize the PDE solution with a Gaussian-smoothed model and show that, via Stein's identity, second-order derivatives can be calculated efficiently without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
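The Stein's-identity trick above can be sketched as follows: for a Gaussian-smoothed function f_sigma(x) = E[f(x + sigma*eps)], the Hessian equals E[f(x + sigma*eps) (eps eps^T - I)] / sigma^2, so second derivatives reduce to forward evaluations only. The test function, sigma, and sample count below are illustrative; the paper's estimator and variance reduction may differ.

```python
# Monte Carlo Hessian of a Gaussian-smoothed function via Stein's identity,
# with no repeated back-propagation. A simple f(x) control variate is
# included: E[eps eps^T - I] = 0, so it keeps the estimator unbiased while
# cutting its variance.
import torch

def f(x):  # scalar test function with known Hessian 2*I
    return (x ** 2).sum(dim=-1)

def stein_hessian(f, x, sigma=0.05, n_samples=200_000):
    eps = torch.randn(n_samples, x.shape[-1])
    fdiff = f(x + sigma * eps) - f(x)            # control variate
    outer = eps[:, :, None] * eps[:, None, :]    # eps eps^T
    eye = torch.eye(x.shape[-1])
    return (fdiff[:, None, None] * (outer - eye)).mean(0) / sigma ** 2

x = torch.tensor([0.5, -0.3])
print(stein_hessian(f, x))  # approx 2*I, the true Hessian of sum(x^2)
```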
- Improved architectures and training algorithms for deep operator networks [0.0]
Operator learning techniques have emerged as a powerful tool for learning maps between infinite-dimensional Banach spaces.
We analyze the training dynamics of deep operator networks (DeepONets) through the lens of Neural Tangent Kernel (NTK) theory.
arXiv Detail & Related papers (2021-10-04T18:34:41Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm that our multi-graph network learns discretization-invariant solution operators for PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)