Guiding continuous operator learning through Physics-based boundary
constraints
- URL: http://arxiv.org/abs/2212.07477v1
- Date: Wed, 14 Dec 2022 19:54:46 GMT
- Title: Guiding continuous operator learning through Physics-based boundary
constraints
- Authors: Nadim Saad, Gaurav Gupta, Shima Alizadeh, Danielle C. Maddix
- Abstract summary: Boundary conditions (BCs) are physics-enforced constraints necessary for solutions of Partial Differential Equations (PDEs).
Current neural-network based approaches that aim to solve PDEs rely only on training data to help the model learn BCs implicitly.
We propose Boundary enforcing Operator Network (BOON) that enables the BC satisfaction of neural operators by making structural changes to the operator kernel.
- Score: 1.5847814664948012
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Boundary conditions (BCs) are important groups of physics-enforced
constraints that are necessary for solutions of Partial Differential Equations
(PDEs) to satisfy at specific spatial locations. These constraints carry
important physical meaning, and guarantee the existence and the uniqueness of
the PDE solution. Current neural-network based approaches that aim to solve
PDEs rely only on training data to help the model learn BCs implicitly. There
is no guarantee of BC satisfaction by these models during evaluation. In this
work, we propose Boundary enforcing Operator Network (BOON) that enables the BC
satisfaction of neural operators by making structural changes to the operator
kernel. We provide our refinement procedure and demonstrate that the solutions
obtained by BOON satisfy physics-based BCs, e.g., Dirichlet, Neumann, and
periodic. Numerical experiments based on multiple PDEs with a wide variety of
applications indicate that the proposed approach ensures satisfaction of BCs
and leads to more accurate solutions over the entire domain. The proposed
correction method exhibits a 2X-20X improvement over a given operator model in
relative $L^2$ error (0.000084 relative $L^2$ error for Burgers' equation).
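BOON enforces BC satisfaction through structural changes to the operator kernel itself; the paper's construction is more involved, but the general idea of correcting a neural operator's output so that it exactly satisfies Dirichlet BCs can be sketched with a simple linear-blending correction. The function name and blending scheme below are illustrative assumptions, not the paper's actual refinement procedure:

```python
import numpy as np

def enforce_dirichlet(u, left_bc, right_bc, x):
    """Correct a predicted 1D solution so that it satisfies the
    Dirichlet BCs u(x[0]) = left_bc and u(x[-1]) = right_bc.

    The boundary residuals are blended back linearly across the
    domain: interior values change smoothly while the endpoints
    are matched. (Illustrative sketch, not BOON's kernel change.)
    """
    r_left = left_bc - u[0]            # residual at the left boundary
    r_right = right_bc - u[-1]         # residual at the right boundary
    w = (x - x[0]) / (x[-1] - x[0])    # linear weight, 0 at x[0], 1 at x[-1]
    return u + (1.0 - w) * r_left + w * r_right

# Example: a raw model output that misses the BCs u(0)=0, u(1)=1
x = np.linspace(0.0, 1.0, 101)
u_pred = np.sin(np.pi * x) + 0.05      # violates both boundary conditions
u_corr = enforce_dirichlet(u_pred, 0.0, 1.0, x)
print(abs(u_corr[0] - 0.0), abs(u_corr[-1] - 1.0))  # both near machine zero
```

A post hoc correction like this fixes the boundary values but leaves the learned operator untouched; BOON's point is that building the correction into the kernel also improves accuracy in the interior of the domain.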
Related papers
- One Operator to Rule Them All? On Boundary-Indexed Operator Families in Neural PDE Solvers [0.0]
We show that standard neural operator training implicitly learns a boundary-indexed family of operators, rather than a single boundary-agnostic operator. We formalize this perspective by framing operator learning as conditional risk minimization over boundary conditions.
arXiv Detail & Related papers (2026-03-02T03:15:00Z) - CompNO: A Novel Foundation Model approach for solving Partial Differential Equations [0.0]
Partial differential equations govern a wide range of physical phenomena, but their numerical solution remains computationally demanding. Recent Scientific Foundation Models (SFMs) aim to alleviate this cost by learning universal surrogates from large collections of simulated systems. We introduce Compositional Neural Operators (CompNO), a compositional neural operator framework for parametric PDEs.
arXiv Detail & Related papers (2026-01-12T10:04:48Z) - Scale-Consistent Learning for Partial Differential Equations [79.48661503591943]
We propose a data augmentation scheme based on scale-consistency properties of PDEs. We then design a scale-informed neural operator that can model a wide range of scales. With scale-consistency, the model trained on $Re$ of 1000 can generalize to $Re$ ranging from 250 to 10000.
arXiv Detail & Related papers (2025-07-24T21:29:52Z) - Physics-Informed Deep B-Spline Networks [4.593829882136678]
We propose physics-informed deep B-spline networks for learning partial differential equations. B-spline networks approximate a family of PDEs with different parameters and ICBCs by learning B-spline control points through neural networks. We show that B-spline networks are universal approximators for such families under mild conditions.
arXiv Detail & Related papers (2025-03-21T01:15:40Z) - Physics-Informed Deep Inverse Operator Networks for Solving PDE Inverse Problems [1.9490282165104331]
Inverse problems involving partial differential equations (PDEs) can be seen as discovering a mapping from measurement data to unknown quantities.
Existing methods typically rely on large amounts of labeled training data, which is impractical for most real-world applications.
We propose a novel architecture called Physics-Informed Deep Inverse Operator Networks (PI-DIONs) which can learn the solution operator of PDE-based inverse problems without labeled training data.
arXiv Detail & Related papers (2024-12-04T09:38:58Z) - Extremization to Fine Tune Physics Informed Neural Networks for Solving Boundary Value Problems [0.1874930567916036]
Theory of Functional Connections (TFC) is used to exactly impose initial and boundary conditions (IBCs) of (I)BVPs on PINNs.
We propose a modification to the TFC framework named Reduced TFC and show a significant improvement in the training and inference time of PINNs.
arXiv Detail & Related papers (2024-06-07T23:25:13Z) - Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
arXiv Detail & Related papers (2024-03-19T08:56:20Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Learning Only On Boundaries: a Physics-Informed Neural operator for
Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments demonstrate the method's effectiveness on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z) - A Deep Learning Framework for Solving Hyperbolic Partial Differential
Equations: Part I [0.0]
This research focuses on the development of a physics informed deep learning framework to approximate solutions to nonlinear PDEs.
The framework naturally handles imposition of boundary conditions (Neumann/Dirichlet), entropy conditions, and regularity requirements.
arXiv Detail & Related papers (2023-07-09T08:27:17Z) - Koopman neural operator as a mesh-free solver of non-linear partial differential equations [15.410070455154138]
We propose the Koopman neural operator (KNO), a new neural operator, to overcome these challenges.
By approximating the Koopman operator, an infinite-dimensional operator governing all possible observations of the dynamic system, we can equivalently learn the solution of a non-linear PDE family.
The KNO exhibits notable advantages compared with previous state-of-the-art models.
arXiv Detail & Related papers (2023-01-24T14:10:15Z) - Neural Basis Functions for Accelerating Solutions to High Mach Euler
Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, that makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z) - Bayesian neural networks for weak solution of PDEs with uncertainty
quantification [3.4773470589069473]
A new physics-constrained neural network (NN) approach is proposed to solve PDEs without labels.
We write the loss function of NNs based on the discretized residual of PDEs through an efficient, convolutional operator-based, and vectorized implementation.
We demonstrate the capability and performance of the proposed framework by applying it to steady-state diffusion, linear elasticity, and nonlinear elasticity.
arXiv Detail & Related papers (2021-01-13T04:57:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.