BENO: Boundary-embedded Neural Operators for Elliptic PDEs
- URL: http://arxiv.org/abs/2401.09323v1
- Date: Wed, 17 Jan 2024 16:47:39 GMT
- Title: BENO: Boundary-embedded Neural Operators for Elliptic PDEs
- Authors: Haixin Wang, Jiaxin Li, Anubhav Dwivedi, Kentaro Hara, Tailin Wu
- Abstract summary: We introduce Boundary-Embedded Neural Operators (BENO) for solving elliptic PDEs.
BENO embeds the complex geometries and inhomogeneous boundary values into the solving of elliptic PDEs.
Our model outperforms state-of-the-art neural operators and strong baselines by an average of 60.96%.
- Score: 15.18712698704595
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Elliptic partial differential equations (PDEs) are a major class of
time-independent PDEs that play a key role in many scientific and engineering
domains such as fluid dynamics, plasma physics, and solid mechanics. Recently,
neural operators have emerged as a promising technique to solve elliptic PDEs
more efficiently by directly mapping the input to solutions. However, existing
networks typically cannot handle complex geometries and inhomogeneous boundary
values present in the real world. Here we introduce Boundary-Embedded Neural
Operators (BENO), a novel neural operator architecture that embeds the complex
geometries and inhomogeneous boundary values into the solving of elliptic PDEs.
Inspired by the classical Green's function, BENO consists of two branches of Graph
Neural Networks (GNNs) for interior source term and boundary values,
respectively. Furthermore, a Transformer encoder maps the global boundary
geometry into a latent vector which influences each message passing layer of
the GNNs. We test our model extensively in elliptic PDEs with various boundary
conditions. We show that all existing baseline methods fail to learn the
solution operator. In contrast, our model, endowed with boundary-embedded
architecture, outperforms state-of-the-art neural operators and strong
baselines by an average of 60.96\%. Our source code can be found at
https://github.com/AI4Science-WestlakeU/beno.git.
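The architecture described above — two GNN branches (one for the interior source term, one for the boundary values) whose message-passing layers are each modulated by a latent vector encoding the boundary geometry — can be sketched in a few lines. This is a minimal, hypothetical NumPy illustration of the idea, not the authors' implementation; all names, sizes, and the mean-field stand-in for the Transformer encoder are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # Two-layer perceptron with a tanh nonlinearity.
    return np.tanh(x @ W1 + b1) @ W2 + b2

def boundary_conditioned_layer(h, edges, z, params):
    # One message-passing layer whose messages are conditioned on the
    # boundary-geometry latent vector z, mimicking how the Transformer
    # encoder's output influences each GNN layer in BENO.
    W1, b1, W2, b2 = params
    agg = np.zeros_like(h)
    for i, j in edges:                       # message from neighbor j to node i
        m = mlp(np.concatenate([h[i], h[j], z]), W1, b1, W2, b2)
        agg[i] += m
    return h + agg                           # residual node update

def make_params(din, dh, dout):
    return (rng.normal(scale=0.1, size=(din, dh)), np.zeros(dh),
            rng.normal(scale=0.1, size=(dh, dout)), np.zeros(dout))

n, d, dz = 6, 8, 4                # toy sizes: nodes, hidden dim, latent dim
edges = [(i, (i + 1) % n) for i in range(n)] + [((i + 1) % n, i) for i in range(n)]
h_src = rng.normal(size=(n, d))   # branch 1: interior source-term features
h_bdy = rng.normal(size=(n, d))   # branch 2: boundary-value features
z = rng.normal(size=dz)           # stand-in for the Transformer's boundary latent

p_src = make_params(2 * d + dz, 16, d)
p_bdy = make_params(2 * d + dz, 16, d)
for _ in range(3):                # a few boundary-conditioned layers per branch
    h_src = boundary_conditioned_layer(h_src, edges, z, p_src)
    h_bdy = boundary_conditioned_layer(h_bdy, edges, z, p_bdy)

u = (h_src + h_bdy).sum(axis=1)   # merge branches into a per-node solution readout
print(u.shape)                    # (6,)
```

The two-branch split mirrors the Green's function decomposition of an elliptic solution into a volume term (driven by the source) and a surface term (driven by the boundary data), which is the stated inspiration for the design.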
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Improving PINNs By Algebraic Inclusion of Boundary and Initial Conditions [0.1874930567916036]
"AI for Science" aims to solve fundamental scientific problems using AI techniques.
In this work we explore the possibility of changing the model being trained from being just a neural network to being a non-linear transformation of it.
This reduces the number of terms in the loss function compared to the standard PINN losses.
arXiv Detail & Related papers (2024-07-30T11:19:48Z) - Learning the boundary-to-domain mapping using Lifting Product Fourier Neural Operators for partial differential equations [5.5927988408828755]
We present a novel FNO-based architecture, named Lifting Product FNO (or LP-FNO) which can map arbitrary boundary functions to a solution in the entire domain.
We demonstrate the efficacy and resolution independence of the proposed LP-FNO for the 2D Poisson equation.
arXiv Detail & Related papers (2024-06-24T15:45:37Z) - Learning Only On Boundaries: a Physics-Informed Neural Operator for
Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments show the method's effectiveness on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Fourier Neural Operator with Learned Deformations for PDEs on General Geometries [75.91055304134258]
We propose a new framework, viz., geo-FNO, to solve PDEs on arbitrary geometries.
Geo-FNO learns to deform the input (physical) domain, which may be irregular, into a latent space with a uniform grid.
We consider a variety of PDEs such as the Elasticity, Plasticity, Euler's, and Navier-Stokes equations, and both forward modeling and inverse design problems.
arXiv Detail & Related papers (2022-07-11T21:55:47Z) - Physics-Embedded Neural Networks: Graph Neural PDE Solvers with Mixed
Boundary Conditions [3.04585143845864]
Graph neural network (GNN) is a promising approach to learning and predicting physical phenomena.
We present a physics-embedded GNN that considers boundary conditions and predicts the state after a long time.
Our model can be a useful standard for realizing reliable, fast, and accurate GNN-based PDE solvers.
arXiv Detail & Related papers (2022-05-24T09:17:27Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional function spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to the state of the art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
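The claim in the Message Passing Neural PDE Solvers entry above — that neural message passing solvers representationally contain classical schemes such as finite differences — can be made concrete with a tiny sketch. This is an assumed illustration, not code from any of the listed papers: with a fixed message weight `w = 1/dx**2` on a 1D chain graph, summed pairwise messages reduce exactly to the centered second-difference stencil.

```python
import numpy as np

# Message passing on a 1D chain graph where the message from neighbor j to
# node i is w * (u[j] - u[i]).  With w = 1/dx**2 and both neighbors present,
# the aggregated message at i equals the centered second difference
# (u[i-1] - 2*u[i] + u[i+1]) / dx**2, i.e. the discrete 1D Laplacian.

def mp_laplacian(u, dx):
    n = len(u)
    out = np.zeros(n)
    w = 1.0 / dx**2          # would be a learnable weight in a neural solver
    for i in range(1, n - 1):
        for j in (i - 1, i + 1):
            out[i] += w * (u[j] - u[i])
    return out

x = np.linspace(0, 2 * np.pi, 101)
u = np.sin(x)
lap = mp_laplacian(u, x[1] - x[0])
# Interior values approximate d^2/dx^2 sin(x) = -sin(x) to O(dx^2).
err = np.max(np.abs(lap[1:-1] + np.sin(x[1:-1])))
print(err < 1e-2)  # True
```

Replacing the fixed scalar `w` with a learned message function recovers the neural-solver setting while keeping classical stencils inside the hypothesis class.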
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.