Learning the boundary-to-domain mapping using Lifting Product Fourier Neural Operators for partial differential equations
- URL: http://arxiv.org/abs/2406.16740v2
- Date: Mon, 1 Jul 2024 15:27:50 GMT
- Title: Learning the boundary-to-domain mapping using Lifting Product Fourier Neural Operators for partial differential equations
- Authors: Aditya Kashi, Arka Daw, Muralikrishnan Gopalakrishnan Meena, Hao Lu,
- Abstract summary: We present a novel FNO-based architecture, named Lifting Product FNO (or LP-FNO), which can map arbitrary boundary functions to a solution over the entire domain.
We demonstrate the efficacy and resolution independence of the proposed LP-FNO for the 2D Poisson equation.
- Score: 5.5927988408828755
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Neural operators such as the Fourier Neural Operator (FNO) have been shown to provide resolution-independent deep learning models that can learn mappings between function spaces. For example, an initial condition can be mapped to the solution of a partial differential equation (PDE) at a future time-step using a neural operator. Despite the popularity of neural operators, their use to predict solution functions over a domain given only data over the boundary (such as a spatially varying Dirichlet boundary condition) remains unexplored. In this paper, we refer to such problems as boundary-to-domain problems; they have a wide range of applications in areas such as fluid mechanics, solid mechanics, heat transfer etc. We present a novel FNO-based architecture, named Lifting Product FNO (or LP-FNO) which can map arbitrary boundary functions defined on the lower-dimensional boundary to a solution in the entire domain. Specifically, two FNOs defined on the lower-dimensional boundary are lifted into the higher dimensional domain using our proposed lifting product layer. We demonstrate the efficacy and resolution independence of the proposed LP-FNO for the 2D Poisson equation.
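A minimal numerical sketch of the two ingredients the abstract names follows: a 1D Fourier layer acting on boundary data, and a "lifting product" that combines two boundary feature maps into a 2D field. The outer-product form of `lifting_product`, the function names, and all shapes are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def spectral_conv_1d(x, weights):
    """1D Fourier layer: FFT, multiply a few low modes by learned
    complex weights, inverse FFT.
    x: (channels, n) real signal; weights: (channels, channels, modes) complex."""
    n = x.shape[-1]
    modes = weights.shape[-1]
    xf = np.fft.rfft(x, axis=-1)              # (channels, n//2 + 1)
    yf = np.zeros_like(xf)
    # channel mixing per retained Fourier mode; higher modes are truncated
    yf[:, :modes] = np.einsum("ck,ock->ok", xf[:, :modes], weights)
    return np.fft.irfft(yf, n=n, axis=-1)

def lifting_product(u, v):
    """Hypothetical 'lifting product': channel-wise outer product of two
    1D boundary feature maps, yielding a 2D interior field per channel.
    u: (c, nx), v: (c, ny) -> (c, nx, ny)."""
    return np.einsum("ci,cj->cij", u, v)
```

In this reading, the two FNO branches would each produce boundary features like `u` and `v`, and the outer product "lifts" them from the 1D boundary into the 2D domain; the true LP-FNO layer may combine them differently.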
Related papers
- Non-overlapping, Schwarz-type Domain Decomposition Method for Physics and Equality Constrained Artificial Neural Networks [0.24578723416255746]
We present a non-overlapping, Schwarz-type domain decomposition method with a generalized interface condition.
Our approach employs physics and equality-constrained artificial neural networks (PECANN) within each subdomain.
A distinct advantage of our domain decomposition method is its ability to learn solutions to both Poisson's and Helmholtz equations.
arXiv Detail & Related papers (2024-09-20T16:48:55Z) - Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z) - Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs [93.82811501035569]
We introduce a new data-efficient and highly parallelizable operator learning approach with reduced memory requirements and better generalization.
MG-TFNO scales to large resolutions by leveraging local and global structures of full-scale, real-world phenomena.
We demonstrate superior performance on the turbulent Navier-Stokes equations where we achieve less than half the error with over 150x compression.
arXiv Detail & Related papers (2023-09-29T20:18:52Z) - Learning Only On Boundaries: A Physics-Informed Neural Operator for Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments show the method's effectiveness on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z) - Fourier Neural Operator with Learned Deformations for PDEs on General Geometries [75.91055304134258]
We propose a new framework, viz., geo-FNO, to solve PDEs on arbitrary geometries.
Geo-FNO learns to deform the input (physical) domain, which may be irregular, into a latent space with a uniform grid.
We consider a variety of PDEs such as the Elasticity, Plasticity, Euler's, and Navier-Stokes equations, and both forward modeling and inverse design problems.
arXiv Detail & Related papers (2022-07-11T21:55:47Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Train Once and Use Forever: Solving Boundary Value Problems in Unseen Domains with Pre-trained Deep Learning Models [0.20999222360659606]
This paper introduces a transferable framework for solving boundary value problems (BVPs) via deep neural networks.
First, we introduce the genomic flow network (GFNet), a neural network that can infer the solution of a BVP across arbitrary boundary conditions.
Then, we propose the mosaic flow (MF) predictor, a novel iterative algorithm that assembles or stitches the GFNet's inferences.
arXiv Detail & Related papers (2021-04-22T05:20:27Z) - Exact imposition of boundary conditions with distance functions in physics-informed deep neural networks [0.5804039129951741]
We introduce geometry-aware trial functions in artificial neural networks to improve deep-learning training for partial differential equations.
To exactly impose homogeneous Dirichlet boundary conditions, the trial function is taken as the distance function $\phi$ multiplied by the PINN approximation.
We present numerical solutions for linear and nonlinear boundary-value problems over domains with affine and curved boundaries.
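The exact-imposition construction described in the summary above admits a compact sketch. The distance function `phi`, the stand-in `net`, and all numeric choices below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def phi(x):
    """Distance-like function on [0, 1]: smooth, positive inside,
    and exactly zero at both boundary points."""
    return x * (1.0 - x)

def net(x, w=3.0):
    """Stand-in for a trained PINN; any smooth function illustrates the idea."""
    return np.sin(w * x)

def trial(x):
    """Trial solution u = phi * net: satisfies u(0) = u(1) = 0 exactly,
    for any network parameters, so the BC never needs a penalty term."""
    return phi(x) * net(x)

x = np.array([0.0, 0.5, 1.0])
u = trial(x)  # endpoints are exactly zero by construction
```

Because the boundary condition holds identically, training only has to minimize the PDE residual in the interior, which is the main appeal of this construction.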
arXiv Detail & Related papers (2021-04-17T03:02:52Z) - Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster than traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z) - A nonlocal physics-informed deep learning framework using the peridynamic differential operator [0.0]
We develop a nonlocal PINN approach using the Peridynamic Differential Operator (PDDO), a numerical method that incorporates long-range interactions and removes spatial derivatives from the governing equations.
Because the PDDO functions can be readily incorporated in the neural network architecture, the nonlocality does not degrade the performance of modern deep-learning algorithms.
We document the superior behavior of nonlocal PINN with respect to local PINN in both solution accuracy and parameter inference.
arXiv Detail & Related papers (2020-05-31T06:26:21Z) - Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work aims to generalize neural networks so that they can learn mappings between infinite-dimensional function spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network has the desired properties and shows competitive performance compared to state-of-the-art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
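As a rough illustration of the kernel integral operator that a graph kernel network approximates, here is a hedged Monte Carlo sketch; the Gaussian kernel, the point set, and the uniform averaging are illustrative assumptions, not the paper's method:

```python
import numpy as np

def kernel_integral(points, u, kappa):
    """Monte Carlo approximation of (K u)(x) = integral of kappa(x, y) u(y) dy:
    for each node x_i, average kappa(x_i, y_j) * u(y_j) over all sampled nodes.
    In a graph kernel network, kappa itself would be a learned neural network
    and the sum would run over graph neighbors."""
    n = len(points)
    out = np.zeros(n)
    for i, x in enumerate(points):
        out[i] = np.mean([kappa(x, y) * u[j] for j, y in enumerate(points)])
    return out

# toy fixed kernel: Gaussian in the distance between nodes (illustrative choice)
kappa = lambda x, y: np.exp(-np.abs(x - y) ** 2)
pts = np.linspace(0.0, 1.0, 8)
v = kernel_integral(pts, np.ones_like(pts), kappa)
```

The composition of such integral operators with pointwise nonlinearities is what the paper formulates as the approximation of the infinite-dimensional mapping.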
This list is automatically generated from the titles and abstracts of the papers in this site.