Solving Partial Differential Equations in Different Domains by Operator Learning method Based on Boundary Integral Equations
- URL: http://arxiv.org/abs/2406.02298v1
- Date: Tue, 4 Jun 2024 13:19:06 GMT
- Title: Solving Partial Differential Equations in Different Domains by Operator Learning method Based on Boundary Integral Equations
- Authors: Bin Meng, Yutong Lu, Ying Jiang
- Abstract summary: This article explores operator learning models that can deduce solutions to partial differential equations (PDEs) on arbitrary domains without requiring retraining.
We introduce two innovative models rooted in boundary integral equations (BIEs): BI-DeepONet and BI-TDONet.
Once fully trained, these BIE-based models adeptly predict the solutions of PDEs in any domain without the need for additional training.
- Score: 13.495279709392104
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This article explores operator learning models that can deduce solutions to partial differential equations (PDEs) on arbitrary domains without requiring retraining. We introduce two innovative models rooted in boundary integral equations (BIEs): the Boundary Integral Type Deep Operator Network (BI-DeepONet) and the Boundary Integral Trigonometric Deep Operator Neural Network (BI-TDONet), which are crafted to address PDEs across diverse domains. Once fully trained, these BIE-based models adeptly predict the solutions of PDEs in any domain without the need for additional training. BI-TDONet notably enhances its performance by employing the singular value decomposition (SVD) of bounded linear operators, allowing for the efficient distribution of input functions across its modules. Furthermore, to tackle the issue of function sampling values that do not effectively capture oscillatory and impulse signal characteristics, trigonometric coefficients are utilized as both inputs and outputs in BI-TDONet. Our numerical experiments robustly support and confirm the efficacy of this theoretical framework.
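To make the operator-learning setup concrete, below is a minimal, hypothetical branch/trunk sketch in PyTorch of the kind of network a BIE-based DeepONet builds on: the branch ingests trigonometric coefficients of the boundary data and the trunk evaluates the predicted boundary density at boundary parameters. The coefficient count, layer widths, and circular boundary parameterization are illustrative assumptions, not the authors' BI-DeepONet/BI-TDONet architecture.

```python
# Minimal sketch (not the authors' code): a DeepONet-style operator network whose
# branch takes truncated trigonometric coefficients of the boundary data and whose
# trunk takes boundary parameters t in [0, 2*pi); the output approximates the
# density of a boundary integral representation. All sizes are assumptions.
import torch
import torch.nn as nn

class BranchTrunkOperator(nn.Module):
    def __init__(self, n_coeff=33, width=128, p=64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(n_coeff, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.trunk = nn.Sequential(
            nn.Linear(2, width), nn.Tanh(),   # (cos t, sin t) encodes the boundary parameter
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )

    def forward(self, coeffs, t):
        # coeffs: (batch, n_coeff) trigonometric coefficients of the boundary data
        # t:      (n_points,) boundary parameters where the density is evaluated
        b = self.branch(coeffs)                              # (batch, p)
        xy = torch.stack([torch.cos(t), torch.sin(t)], -1)   # (n_points, 2)
        tr = self.trunk(xy)                                   # (n_points, p)
        return b @ tr.T                                       # (batch, n_points) density values

model = BranchTrunkOperator()
density = model(torch.randn(8, 33), torch.linspace(0, 2 * torch.pi, 100))
```

Once such a density is predicted, the PDE solution in any domain can be recovered by evaluating the corresponding boundary integral, which is what makes the trained model reusable across geometries.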
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
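A rough guess at what such a product layer could look like (purely illustrative; the actual ProdLayer definition is given in the paper): pairwise products of channel features are appended to the representation before a linear mixing step.

```python
# Hypothetical sketch of a product layer: append pairwise products of channel
# features, then mix linearly. An interpretation of the idea, not the published code.
import torch
import torch.nn as nn

class ProductLayer(nn.Module):
    def __init__(self, channels, n_pairs=4):
        super().__init__()
        self.n_pairs = n_pairs
        self.mix = nn.Linear(channels + n_pairs, channels)

    def forward(self, x):
        # x: (batch, n_points, channels), with channels >= 2 * n_pairs
        prods = torch.stack(
            [x[..., 2 * i] * x[..., 2 * i + 1] for i in range(self.n_pairs)], dim=-1
        )
        return self.mix(torch.cat([x, prods], dim=-1))

y = ProductLayer(channels=16)(torch.randn(2, 64, 16))   # usage example
```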
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Quantitative Approximation for Neural Operators in Nonlinear Parabolic Equations [0.40964539027092917]
We derive the approximation rate of solution operators for nonlinear parabolic partial differential equations (PDEs).
Our results show that neural operators can efficiently approximate these solution operators without the exponential growth in model complexity.
A key insight in our proof is to transform the PDEs into the corresponding integral equations via Duhamel's principle, and to leverage the similarity between neural operators and Picard's iteration.
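For reference, the reformulation used in the proof is the standard Duhamel representation of a semilinear parabolic equation u_t = Δu + N(u) as an integral equation, which Picard iteration then approximates:

$$ u(t) \;=\; e^{t\Delta}u_0 \;+\; \int_0^t e^{(t-s)\Delta}\, N\bigl(u(s)\bigr)\, ds, \qquad u^{(k+1)}(t) \;=\; e^{t\Delta}u_0 \;+\; \int_0^t e^{(t-s)\Delta}\, N\bigl(u^{(k)}(s)\bigr)\, ds. $$

Each Picard iterate applies an integral operator to the previous iterate, which mirrors the layer-wise structure of neural operators and is the similarity the proof leverages.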
arXiv Detail & Related papers (2024-10-03T02:28:17Z)
- DeltaPhi: Learning Physical Trajectory Residual for PDE Solving [54.13671100638092]
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn the surrogate model for the residual operator mapping based on existing neural operator networks.
We conclude that, compared to direct learning, physical residual learning is preferred for PDE solving.
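One way to read the residual-learning idea (a sketch under assumptions, not the DeltaPhi implementation): retrieve a similar training trajectory and let the neural operator predict only the correction to its known solution.

```python
# Minimal sketch of physical-trajectory residual learning (an interpretation of the
# idea, not the authors' code): nearest-neighbour retrieval of a reference trajectory,
# then a neural operator predicts the residual with respect to its known solution.
# The operator is assumed to map a (3, n_points) stack to an (n_points,) field.
import torch

def predict_with_residual(operator, a, train_inputs, train_solutions):
    # a: (n_points,) new PDE input; train_inputs/train_solutions: (N, n_points)
    dists = ((train_inputs - a) ** 2).sum(dim=-1)        # nearest-neighbour retrieval
    j = torch.argmin(dists)
    a_ref, u_ref = train_inputs[j], train_solutions[j]
    residual = operator(torch.stack([a, a_ref, u_ref]))  # operator sees input + reference pair
    return u_ref + residual                              # correct the reference solution
```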
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- A Hybrid Kernel-Free Boundary Integral Method with Operator Learning for Solving Parametric Partial Differential Equations In Complex Domains [0.0]
The Kernel-Free Boundary Integral (KFBI) method presents an iterative solution to boundary integral equations arising from elliptic partial differential equations (PDEs).
We propose a hybrid KFBI method, integrating the foundational principles of the KFBI method with the capabilities of deep learning.
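For orientation, here is a hedged sketch of the generic iterative structure for a second-kind boundary integral equation that KFBI-type methods build on, with the operator action supplied by a callable that could be a trained surrogate; this is not the specific hybrid algorithm of the paper.

```python
# Hedged sketch: a relaxed fixed-point (Richardson-type) iteration for a second-kind
# boundary integral equation (phi/2 + K phi = g). The action of K is a callable that
# could be replaced by a learned surrogate in a hybrid method.
import torch

def solve_bie(apply_K, g, n_iter=50, omega=0.8):
    phi = torch.zeros_like(g)
    for _ in range(n_iter):
        residual = g - (0.5 * phi + apply_K(phi))   # residual of (I/2 + K) phi = g
        phi = phi + omega * residual                # relaxed fixed-point update
    return phi
```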
arXiv Detail & Related papers (2024-04-23T17:25:35Z)
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose the Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems governed by PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
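A possible reading of codomain tokenization (an interpretation for illustration, not the CoDA-NO code): each physical variable of the multiphysics state becomes one token, and self-attention mixes information across variables.

```python
# Rough sketch of codomain (channel) tokenization: one token per physical variable,
# mixed by standard self-attention. Sizes and layers are illustrative assumptions.
import torch
import torch.nn as nn

class CodomainAttention(nn.Module):
    def __init__(self, n_points, d_model=64, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(n_points, d_model)                 # one token per variable
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(d_model, n_points)

    def forward(self, u):
        # u: (batch, n_vars, n_points) - e.g. velocity components, pressure, temperature
        tokens = self.embed(u)                                    # (batch, n_vars, d_model)
        mixed, _ = self.attn(tokens, tokens, tokens)              # attention across variables
        return self.proj(mixed)                                   # back to per-variable fields

out = CodomainAttention(n_points=128)(torch.randn(2, 4, 128))     # usage example
```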
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
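The differential-operator claim can be illustrated with a standard finite-difference fact: a fixed convolution kernel rescaled by 1/h^2 converges to a second derivative as the grid spacing h shrinks. A minimal NumPy check (illustrative example, not the paper's construction):

```python
# A fixed 3-point convolution kernel, rescaled by 1/h**2, approximates the second
# derivative; the error shrinks as the grid spacing h decreases.
import numpy as np

def second_derivative_via_conv(f, h):
    kernel = np.array([1.0, -2.0, 1.0]) / h**2        # scaled CNN-style kernel
    return np.convolve(f, kernel, mode="valid")

x = np.linspace(0, 2 * np.pi, 2001)
h = x[1] - x[0]
approx = second_derivative_via_conv(np.sin(x), h)
print(np.max(np.abs(approx + np.sin(x[1:-1]))))       # ~1e-6: matches -sin(x)
```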
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
- GIT-Net: Generalized Integral Transform for Operator Learning [58.13313857603536]
This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators.
GIT-Net harnesses the fact that differential operators commonly used for defining PDEs can often be represented parsimoniously when expressed in specialized functional bases.
Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting small test errors and low evaluation costs across a range of PDE problems.
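The observation that differential operators become parsimonious in a specialized basis can be illustrated with the classical example that the Fourier basis diagonalizes constant-coefficient operators, so a periodic Poisson problem reduces to division by k^2 (an illustrative example, not GIT-Net itself):

```python
# In the Fourier basis the periodic 1-D Laplacian is diagonal, so -u'' = f reduces
# to dividing Fourier coefficients by k^2.
import numpy as np

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)                                   # right-hand side
k = np.fft.fftfreq(n, d=1.0 / n)                    # integer wavenumbers
f_hat = np.fft.fft(f)
u_hat = np.zeros_like(f_hat)
nonzero = k != 0
u_hat[nonzero] = f_hat[nonzero] / k[nonzero] ** 2   # diagonal solve in Fourier space
u = np.real(np.fft.ifft(u_hat))
print(np.max(np.abs(u - np.sin(3 * x) / 9)))        # exact solution is sin(3x)/9
```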
arXiv Detail & Related papers (2023-12-05T03:03:54Z)
- Learning Only On Boundaries: a Physics-Informed Neural Operator for Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments demonstrate the effectiveness of the method on problems with parametrized complex geometries and on unbounded problems.
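The boundary-only character of such methods typically rests on a potential representation (a standard fact, stated here for context): for Laplace's equation, for example, a single-layer potential

$$ u(x) \;=\; \int_{\partial\Omega} G(x, y)\, \varphi(y)\, ds(y), \qquad x \in \Omega, $$

with G the fundamental solution of the Laplacian, satisfies Δu = 0 identically in the interior, so only the boundary condition constrains the unknown density φ and learning can be confined to the boundary.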
arXiv Detail & Related papers (2023-08-24T17:29:57Z)
- One-shot learning for solution operators of partial differential equations [3.559034814756831]
Learning and solving governing equations of a physical system, represented by partial differential equations (PDEs), from data is a central challenge in a variety of areas of science and engineering.
Traditional numerical methods for solving PDEs can be computationally expensive for complex systems and require the complete PDEs of the physical system.
Here, we propose the first solution operator learning method that only requires one PDE solution, i.e., one-shot learning.
arXiv Detail & Related papers (2021-04-06T17:35:10Z)
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to the state of the art solvers.
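A minimal sketch of one kernel-integral layer realized by full message passing over sampled points (layer sizes and the kernel network are illustrative assumptions, not the paper's exact model): v_{t+1}(x_i) = sigma( W v_t(x_i) + mean_j kappa(x_i, x_j) v_t(x_j) ).

```python
# One kernel-integral layer: a small network kappa produces a matrix-valued kernel on
# point pairs, and averaging over neighbours discretizes the integral operator.
import torch
import torch.nn as nn

class KernelIntegralLayer(nn.Module):
    def __init__(self, width=32):
        super().__init__()
        self.W = nn.Linear(width, width)
        self.kappa = nn.Sequential(                      # kernel network on point pairs
            nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, width * width)
        )
        self.width = width

    def forward(self, x, v):
        # x: (n, 2) node coordinates, v: (n, width) node features
        n = x.shape[0]
        pairs = torch.cat(
            [x.unsqueeze(1).expand(n, n, 2), x.unsqueeze(0).expand(n, n, 2)], dim=-1
        )
        K = self.kappa(pairs).view(n, n, self.width, self.width)
        aggregated = torch.einsum("ijkl,jl->ik", K, v) / n   # discretized integral operator
        return torch.relu(self.W(v) + aggregated)

out = KernelIntegralLayer()(torch.rand(50, 2), torch.rand(50, 32))   # usage example
```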
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
- Solving inverse-PDE problems with physics-aware neural networks [0.0]
We propose a novel framework to find unknown fields in the context of inverse problems for partial differential equations.
We blend the high expressibility of deep neural networks as universal function estimators with the accuracy and reliability of existing numerical algorithms.
arXiv Detail & Related papers (2020-01-10T18:46:50Z)