One Operator to Rule Them All? On Boundary-Indexed Operator Families in Neural PDE Solvers
- URL: http://arxiv.org/abs/2603.01406v1
- Date: Mon, 02 Mar 2026 03:15:00 GMT
- Title: One Operator to Rule Them All? On Boundary-Indexed Operator Families in Neural PDE Solvers
- Authors: Lennon J. Shikhman
- Abstract summary: We show that standard neural operator training implicitly learns a boundary-indexed family of operators, rather than a single boundary-agnostic operator. We formalize this perspective by framing operator learning as conditional risk minimization over boundary conditions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural PDE solvers are often described as learning solution operators that map problem data to PDE solutions. In this work, we argue that this interpretation is generally incorrect when boundary conditions vary. We show that standard neural operator training implicitly learns a boundary-indexed family of operators, rather than a single boundary-agnostic operator, with the learned mapping fundamentally conditioned on the boundary-condition distribution seen during training. We formalize this perspective by framing operator learning as conditional risk minimization over boundary conditions, which leads to a non-identifiability result outside the support of the training boundary distribution. As a consequence, generalization in forcing terms or resolution does not imply generalization across boundary conditions. We support our theoretical analysis with controlled experiments on the Poisson equation, demonstrating sharp degradation under boundary-condition shifts, cross-distribution failures between distinct boundary ensembles, and convergence to conditional expectations when boundary information is removed. Our results clarify a core limitation of current neural PDE solvers and highlight the need for explicit boundary-aware modeling in the pursuit of foundation models for PDEs.
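The abstract's claim that a boundary-blind model collapses to a conditional expectation can be checked on a toy version of the Poisson setting. The sketch below (an illustrative construction, not the authors' code; all parameter choices are assumptions) uses the linearity of the 1D Poisson problem: averaging solutions over sampled Dirichlet data gives exactly the solution with the averaged data, which is the conditional expectation E[u | f] that a model deprived of boundary information would converge to.

```python
import numpy as np

def solve_poisson_1d(f_vals, a, b, n=101):
    """Finite-difference solve of -u'' = f on [0, 1] with u(0)=a, u(1)=b."""
    h = 1.0 / (n - 1)
    # Tridiagonal system for the n-2 interior points
    A = (np.diag(2.0 * np.ones(n - 2))
         + np.diag(-np.ones(n - 3), 1)
         + np.diag(-np.ones(n - 3), -1))
    rhs = f_vals[1:-1] * h**2
    rhs[0] += a          # boundary values enter through the right-hand side
    rhs[-1] += b
    u = np.empty(n)
    u[0], u[-1] = a, b
    u[1:-1] = np.linalg.solve(A, rhs)
    return u

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 101)
f = np.sin(np.pi * x)    # one fixed forcing term

# Sample Dirichlet data (a, b) from a "training" distribution and average solutions.
bcs = rng.normal(loc=[1.0, -2.0], scale=0.5, size=(500, 2))
mean_solution = np.mean([solve_poisson_1d(f, a, b) for a, b in bcs], axis=0)

# By linearity, the boundary-blind MSE-optimal predictor is the solution with
# the *mean* boundary values -- the conditional-expectation collapse above.
u_mean_bc = solve_poisson_1d(f, bcs[:, 0].mean(), bcs[:, 1].mean())
print(np.max(np.abs(mean_solution - u_mean_bc)))  # near machine precision
```

The point of the sketch is that no function of f alone can distinguish boundary ensembles with the same mean, matching the paper's non-identifiability argument.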
Related papers
- CompNO: A Novel Foundation Model approach for solving Partial Differential Equations [0.0]
Partial differential equations govern a wide range of physical phenomena, but their numerical solution remains computationally demanding. Recent Scientific Foundation Models (SFMs) aim to alleviate this cost by learning universal surrogates from large collections of simulated systems. We introduce Compositional Neural Operators (CompNO), a compositional neural operator framework for parametric PDEs.
arXiv Detail & Related papers (2026-01-12T10:04:48Z)
- Enforcing boundary conditions for physics-informed neural operators [0.3058685580689604]
Machine-learning-based methods like physics-informed neural networks are becoming increasingly adept at solving even complex systems of partial differential equations. Boundary conditions can be enforced either weakly, by penalizing deviations in the loss function, or strongly, by training a solution structure that inherently matches the prescribed values and derivatives.
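The weak/strong distinction in this summary can be made concrete. Below is a minimal sketch of the "strong" route (a standard illustrative ansatz, not necessarily this paper's construction): the trial solution is built so that Dirichlet values u(0)=a, u(1)=b hold identically, with the factor x(1-x) annihilating the network output at both endpoints.

```python
import numpy as np

# Strong (hard) enforcement of Dirichlet data via a lifting-plus-mask ansatz:
#     u(x) = a*(1 - x) + b*x + x*(1 - x) * N(x)
# The linear lifting matches the boundary values, and x*(1 - x) vanishes at
# x = 0 and x = 1, so any network output N leaves them untouched.

def hard_bc_solution(x, a, b, network):
    return a * (1 - x) + b * x + x * (1 - x) * network(x)

# A toy stand-in for a neural network: any function of x works here.
network = lambda x: np.tanh(3.0 * x) + x**2

x = np.linspace(0, 1, 11)
u = hard_bc_solution(x, a=2.0, b=-1.0, network=network)
print(u[0], u[-1])  # exactly 2.0 and -1.0, regardless of the network
```

Weak enforcement would instead add a penalty like `lam * ((u[0] - a)**2 + (u[-1] - b)**2)` to the training loss, satisfying the boundary only approximately.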
arXiv Detail & Related papers (2025-10-28T15:51:48Z)
- BEKAN: Boundary condition-guaranteed evolutionary Kolmogorov-Arnold networks with radial basis functions for solving PDE problems [11.258825397319143]
We propose a boundary condition-guaranteed evolutionary Kolmogorov-Arnold Network (KAN) with radial basis functions (BEKAN). In BEKAN, we introduce three distinct approaches for incorporating Dirichlet, periodic, and Neumann boundary conditions into the network. By virtue of the boundary-embedded RBFs, the periodic layer, and the evolutionary framework, we can perform accurate PDE simulations while rigorously enforcing boundary conditions.
arXiv Detail & Related papers (2025-10-03T23:57:23Z)
- Physics-Informed Deep B-Spline Networks [4.593829882136678]
We propose physics-informed deep B-spline networks for learning partial differential equations. B-spline networks approximate a family of PDEs with different parameters and initial/boundary conditions (ICBCs) by learning B-spline control points through neural networks. We show that B-spline networks are universal approximators for such families under mild conditions.
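The control-point idea in this summary can be sketched with a fixed spline basis (illustrative only; the paper's actual parameterization may differ). With a clamped knot vector, a B-spline curve interpolates its end control points, which is what makes boundary values easy to pin down through the coefficients a network would output.

```python
import numpy as np
from scipy.interpolate import BSpline

k = 2                                       # quadratic B-splines
c = np.array([1.0, 0.3, -0.5, 0.8, 2.0])    # control points (network outputs)
n = len(c)

# Clamped knot vector: repeat each end knot (k+1) times so the curve passes
# exactly through c[0] and c[-1]; length must be n + k + 1.
t = np.concatenate([np.zeros(k), np.linspace(0, 1, n - k + 1), np.ones(k)])

spline = BSpline(t, c, k)
print(spline(0.0), spline(1.0))   # 1.0 and 2.0: the endpoint control points
```

In the learned setting, only the vector `c` would depend on PDE parameters and ICBCs; the basis (knots and degree) stays fixed, so boundary values reduce to constraints on the first and last coefficients.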
arXiv Detail & Related papers (2025-03-21T01:15:40Z)
- Safe PDE Boundary Control with Neural Operators [7.537923263907072]
We introduce a neural boundary control barrier function (BCBF) to ensure trajectory-wise constraint satisfaction of the boundary output. Experiments under challenging hyperbolic, parabolic, and Navier-Stokes PDE dynamics environments validate the plug-and-play effectiveness of the proposed method.
arXiv Detail & Related papers (2024-11-23T20:15:51Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
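The equilibrium idea behind FNO-DEQ can be illustrated with a toy weight-tied layer (a deliberately simplified stand-in, not the FNO architecture): the "infinite-depth" output is the fixed point of a single layer, found by iteration, just as a steady-state PDE is itself a fixed-point condition rather than a time rollout.

```python
import numpy as np

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((8, 8))   # small weights keep the layer contractive
x = rng.standard_normal(8)              # injected problem data (e.g. forcing)

def g(z, x):
    """One weight-tied layer; the DEQ output solves z = g(z, x)."""
    return np.tanh(W @ z + x)

z = np.zeros(8)
for _ in range(200):                    # simple fixed-point iteration
    z_next = g(z, x)
    if np.max(np.abs(z_next - z)) < 1e-10:
        z = z_next
        break
    z = z_next

residual = np.max(np.abs(g(z, x) - z))
print(residual)  # ~0: z satisfies the equilibrium condition
```

Practical DEQ solvers replace the naive iteration with accelerated root-finders and use implicit differentiation through the fixed point, but the equilibrium condition is the same.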
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Guiding continuous operator learning through Physics-based boundary constraints [1.5847814664948012]
Boundary conditions (BCs) are physics-enforced constraints necessary for solutions of Partial Differential Equations (PDEs).
Current neural-network based approaches that aim to solve PDEs rely only on training data to help the model learn BCs implicitly.
We propose the Boundary enforcing Operator Network (BOON), which enables neural operators to satisfy BCs by making structural changes to the operator kernel.
arXiv Detail & Related papers (2022-12-14T19:54:46Z)
- On the Importance of Gradient Norm in PAC-Bayesian Bounds [92.82627080794491]
We propose a new generalization bound that exploits the contractivity of the log-Sobolev inequalities.
We empirically analyze the effect of this new loss-gradient norm term on different neural architectures.
arXiv Detail & Related papers (2022-10-12T12:49:20Z)
- LordNet: An Efficient Neural Network for Learning to Solve Parametric Partial Differential Equations without Simulated Data [47.49194807524502]
We propose LordNet, a tunable and efficient neural network for modeling entanglements.
The experiments on solving Poisson's equation and (2D and 3D) Navier-Stokes equation demonstrate that the long-range entanglements can be well modeled by the LordNet.
arXiv Detail & Related papers (2022-06-19T14:41:08Z)
- Zero Pixel Directional Boundary by Vector Transform [77.63061686394038]
We re-interpret boundaries as 1-D surfaces and formulate a one-to-one vector transform function that allows boundary prediction to be trained while completely avoiding the class-imbalance issue.
Our problem formulation leads to the estimation of direction as well as richer contextual information of the boundary and, if desired, the availability of zero-pixel thin boundaries at training time.
arXiv Detail & Related papers (2022-03-16T17:55:31Z)
- Optimal variance-reduced stochastic approximation in Banach spaces [114.8734960258221]
We study the problem of estimating the fixed point of a contractive operator defined on a separable Banach space.
We establish non-asymptotic bounds for both the operator defect and the estimation error.
arXiv Detail & Related papers (2022-01-21T02:46:57Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Active Boundary Loss for Semantic Segmentation [58.72057610093194]
This paper proposes a novel active boundary loss for semantic segmentation.
It can progressively encourage the alignment between predicted boundaries and ground-truth boundaries during end-to-end training.
Experimental results show that training with the active boundary loss can effectively improve the boundary F-score and mean Intersection-over-Union.
arXiv Detail & Related papers (2021-02-04T15:47:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.