Diffeomorphism Neural Operator for various domains and parameters of partial differential equations
- URL: http://arxiv.org/abs/2402.12475v2
- Date: Thu, 20 Jun 2024 04:08:36 GMT
- Title: Diffeomorphism Neural Operator for various domains and parameters of partial differential equations
- Authors: Zhiwei Zhao, Changqing Liu, Yingguang Li, Zhibin Chen, Xu Liu
- Abstract summary: This paper presents a novel neural operator learning framework for solving partial differential equations (PDEs) with various domains and parameters defined for physical systems, named the diffeomorphism neural operator (DNO).
The DNO framework demonstrated robust learning capabilities and strong generalization performance across various domains and parameters.
- Score: 9.237670541637192
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In scientific and engineering applications, solving partial differential equations (PDEs) across various parameters and domains normally relies on resource-intensive numerical methods. Neural operators based on deep learning offer a promising alternative by directly learning physical laws from data. However, current neural operator methods are limited to solving PDEs on fixed domains. Extending neural operators to solve PDEs on various domains holds significant promise for medical imaging, engineering design, and manufacturing applications, where geometric and parameter changes are essential. This paper presents a novel neural operator learning framework, named the diffeomorphism neural operator (DNO), for solving PDEs with various domains and parameters defined for physical systems. The main idea is that a neural operator learns on a generic domain to which the various physical domains governed by the same PDE are diffeomorphically mapped. In this way, the challenge of operator learning on various domains is transformed into operator learning on the generic domain. The generalization performance of DNO on different domains can be assessed by a proposed method that evaluates the geometric similarity between a new domain and the domains of the training dataset after diffeomorphism. Experiments on Darcy flow, pipe flow, airfoil flow, and mechanics were carried out, with harmonic and volume parameterizations used as the diffeomorphisms for 2D and 3D domains. The DNO framework demonstrated robust learning capabilities and strong generalization performance across various domains and parameters.
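The core mechanism of DNO, pulling every physical domain back to a single generic domain before operator learning, can be illustrated with a short sketch. The snippet below is a minimal illustration rather than the authors' implementation: it assumes the diffeomorphism (e.g., a harmonic map) has already been computed and sampled into a coordinate grid, here called `phi_inv_grid`, and it uses a plain convolutional network as a stand-in for the neural operator on the generic domain.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of the DNO idea (not the authors' code): learn on one
# fixed generic domain by pulling fields back through a diffeomorphism.

class GenericDomainOperator(torch.nn.Module):
    """Stand-in neural operator acting on the generic domain."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Conv2d(1, channels, 3, padding=1), torch.nn.GELU(),
            torch.nn.Conv2d(channels, channels, 3, padding=1), torch.nn.GELU(),
            torch.nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, a):   # a: (batch, 1, H, W) parameter field
        return self.net(a)  # predicted solution on the generic grid

def pull_back(field, phi_inv_grid):
    """Resample a field from the physical domain onto the generic domain.

    field:        (batch, 1, H, W) values sampled on the physical domain
    phi_inv_grid: (batch, H, W, 2) physical coordinates, scaled to [-1, 1],
                  of each generic-grid point (the sampled diffeomorphism).
    """
    return F.grid_sample(field, phi_inv_grid, align_corners=True)

# Toy usage: one Darcy-like sample on a slightly deformed domain.
B, H, W = 1, 64, 64
a_physical = torch.rand(B, 1, H, W)                 # PDE parameter field
ys, xs = torch.meshgrid(torch.linspace(-1, 1, H),
                        torch.linspace(-1, 1, W), indexing="ij")
base = torch.stack([xs, ys], dim=-1).unsqueeze(0)   # identity map, (1, H, W, 2)
phi_inv_grid = base + 0.05 * torch.randn_like(base) # stand-in deformation

model = GenericDomainOperator()
a_generic = pull_back(a_physical, phi_inv_grid)     # transport to generic domain
u_generic = model(a_generic)                        # operator learning happens here
# A prediction would then be pushed forward to the physical domain by the
# inverse map, so one trained operator serves many domain shapes.
```

Training would pair pulled-back parameter fields with pulled-back solutions, so the operator never sees the varying geometries directly; the paper's similarity assessment then compares new domains to training domains in this generic coordinate system.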
Related papers
- Graph Fourier Neural Kernels (G-FuNK): Learning Solutions of Nonlinear Diffusive Parametric PDEs on Multiple Domains [2.8780581594260926]
We introduce a novel family of neural operators based on our Graph Fourier Neural Kernels.
G-FuNK combines components that are parameter- and domain-adapted with others that are not.
Experiments show G-FuNK's capability to accurately approximate heat, reaction-diffusion, and cardiac electrophysiology equations.
arXiv Detail & Related papers (2024-10-06T23:55:34Z) - Physics-Informed Geometry-Aware Neural Operator [1.2430809884830318]
Engineering design problems often involve solving parametric Partial Differential Equations (PDEs) under variable PDE parameters and domain geometry.
Recent neural operators have shown promise in learning PDE operators and quickly predicting the PDE solutions.
We introduce a novel method, the Physics-Informed Geometry-Aware Neural Operator (PI-GANO), designed to simultaneously generalize across both PDE parameters and domain geometries.
arXiv Detail & Related papers (2024-08-02T23:11:42Z) - Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model repeatedly applies a discretization-independent learning of the parameter embedding.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose the Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems governed by PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems (a minimal sketch of this tokenization appears after this list).
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
arXiv Detail & Related papers (2024-03-19T08:56:20Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
An AI framework known as Neural Operators offers a principled approach to learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Unsupervised Legendre-Galerkin Neural Network for Stiff Partial Differential Equations [9.659504024299896]
We propose an unsupervised machine learning algorithm based on the Legendre-Galerkin neural network to find an accurate approximation to the solution of different types of PDEs.
The proposed neural network is applied to the general 1D and 2D PDEs as well as singularly perturbed PDEs that possess boundary layer behavior.
arXiv Detail & Related papers (2022-07-21T00:47:47Z) - Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs).
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history-dependencies and discontinuities in time can be represented as summations of complex exponentials (a minimal sketch of this representation appears after this list).
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite-dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z) - One-shot learning for solution operators of partial differential equations [3.559034814756831]
Learning and solving governing equations of a physical system, represented by partial differential equations (PDEs), from data is a central challenge in a variety of areas of science and engineering.
Traditional numerical methods for solving PDEs can be computationally expensive for complex systems and require the complete PDEs of the physical system.
Here, we propose the first solution operator learning method that only requires one PDE solution, i.e., one-shot learning.
arXiv Detail & Related papers (2021-04-06T17:35:10Z) - Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate the approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators (a minimal sketch of such a kernel integral layer appears after this list).
Experiments confirm that the proposed graph kernel network has the desired properties and shows competitive performance compared to state-of-the-art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
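For the CoDA-NO entry above, codomain (channel-space) tokenization can be sketched as follows. This is a hedged illustration under assumed shapes, not the paper's architecture: each physical variable becomes one attention token, so attention mixes information across variables rather than across space, and the embedding layers and sizes are invented for the example.

```python
import torch

# Hedged sketch of codomain/channel-space tokenization (CoDA-NO entry).
# Each channel (physical variable) becomes one token; attention then
# mixes information across variables. Layer sizes are illustrative.
class CodomainAttention(torch.nn.Module):
    def __init__(self, n_points: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = torch.nn.Linear(n_points, d_model)    # variable -> token
        self.attn = torch.nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.unembed = torch.nn.Linear(d_model, n_points)  # token -> variable

    def forward(self, u):
        # u: (batch, channels, n_points), one token per channel/variable
        tokens = self.embed(u)
        mixed, _ = self.attn(tokens, tokens, tokens)
        return self.unembed(mixed)

u = torch.rand(2, 3, 256)        # e.g. 3 physical variables on 256 points
out = CodomainAttention(256)(u)  # same shape: (2, 3, 256)
```

Because tokens are indexed by variable rather than by location, adding or removing a physical variable changes only the token count, which is one way to see why channel-wise tokenization suits pretraining across PDE systems with different variable sets.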
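The Neural Laplace entry represents trajectories as summations of complex exponentials. Below is a tiny NumPy illustration of that representation, with made-up poles and residues standing in for what the model would learn.

```python
import numpy as np

# Illustration of the Laplace-domain idea (Neural Laplace entry): a
# trajectory written as a sum of complex exponentials, each exp(s_k * t)
# coming from a pole s_k with residue c_k. Poles and residues are made
# up here; in the paper they come from a learned model.
poles = np.array([-0.5 + 3.0j, -0.1 + 1.0j])    # Re: decay, Im: oscillation
residues = np.array([1.0 + 0.0j, 0.5 - 0.2j])

t = np.linspace(0.0, 10.0, 200)
trajectory = np.real(sum(c * np.exp(s * t) for s, c in zip(poles, residues)))
# History effects and discontinuities in time map to pole/residue
# structure in the Laplace domain, which is easier to learn and extrapolate.
```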
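Both "Neural Operator: Learning Maps Between Function Spaces" and the graph kernel network entry build on kernel integral layers of the form v(x) = integral of kappa(x, y) u(y) dy. Here is a minimal Monte-Carlo sketch of one such layer, under assumed shapes and with an invented MLP kernel; it is an illustration of the formulation, not either paper's implementation.

```python
import torch

class KernelIntegralLayer(torch.nn.Module):
    """Sketch of one kernel integral layer, v(x) ~ (1/N) * sum_y kappa(x, y) u(y).

    kappa is a small MLP over coordinate pairs producing a matrix-valued
    kernel; the integral is a Monte-Carlo average over the sample points.
    Sizes are illustrative, not taken from either paper.
    """
    def __init__(self, dim_coord: int = 2, dim_feat: int = 16):
        super().__init__()
        self.dim_feat = dim_feat
        self.kappa = torch.nn.Sequential(
            torch.nn.Linear(2 * dim_coord, 64), torch.nn.GELU(),
            torch.nn.Linear(64, dim_feat * dim_feat),
        )

    def forward(self, x, u):
        # x: (N, d) point coordinates, u: (N, c) features at those points
        n, d = x.shape
        pairs = torch.cat([x.unsqueeze(1).expand(n, n, d),
                           x.unsqueeze(0).expand(n, n, d)], dim=-1)
        k = self.kappa(pairs).view(n, n, self.dim_feat, self.dim_feat)
        return torch.einsum("xyij,yj->xi", k, u) / n   # averaged integral

x = torch.rand(128, 2)            # 128 sample points in 2D
u = torch.rand(128, 16)           # 16-dim features per point
v = KernelIntegralLayer()(x, u)   # (128, 16)
```

Because the kernel depends only on coordinates, the same layer can be evaluated on any point cloud, which is the discretization-independence property these papers emphasize.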
This list is automatically generated from the titles and abstracts of the papers in this site.