Physics-Informed Deep B-Spline Networks
- URL: http://arxiv.org/abs/2503.16777v2
- Date: Sat, 18 Oct 2025 16:03:45 GMT
- Title: Physics-Informed Deep B-Spline Networks
- Authors: Zhuoyuan Wang, Raffaele Romagnoli, Saviz Mowlavi, Yorie Nakahira
- Abstract summary: We propose physics-informed deep B-spline networks for learning partial differential equations. B-spline networks approximate a family of PDEs with different parameters and ICBCs by learning B-spline control points through neural networks. We show that B-spline networks are universal approximators for such families under mild conditions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed machine learning offers a promising framework for solving complex partial differential equations (PDEs) by integrating observational data with governing physical laws. However, learning PDEs with varying parameters and changing initial conditions and boundary conditions (ICBCs) with theoretical guarantees remains an open challenge. In this paper, we propose physics-informed deep B-spline networks, a novel technique that approximates a family of PDEs with different parameters and ICBCs by learning B-spline control points through neural networks. The proposed B-spline representation reduces the learning task from predicting solution values over the entire domain to learning a compact set of control points, enforces strict compliance to initial and Dirichlet boundary conditions by construction, and enables analytical computation of derivatives for incorporating PDE residual losses. While existing approximation and generalization theories are not applicable in this setting - where solutions of parametrized PDE families are represented via B-spline bases - we fill this gap by showing that B-spline networks are universal approximators for such families under mild conditions. We also derive generalization error bounds for physics-informed learning in both elliptic and parabolic PDE settings, establishing new theoretical guarantees. Finally, we demonstrate in experiments that the proposed technique has improved efficiency-accuracy tradeoffs compared to existing techniques in a dynamical system problem with discontinuous ICBCs and can handle nonhomogeneous ICBCs and non-rectangular domains.
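The core representation described above — a neural network predicting a compact set of B-spline control points, from which the solution follows analytically — can be illustrated with a minimal sketch. The Cox-de Boor recursion and the clamped knot vector below are standard B-spline machinery, not the authors' code; the function names and the choice of five control points are hypothetical. With a clamped knot vector, the spline interpolates its first and last control points, which is how Dirichlet data can be enforced by construction.

```python
def bspline_basis(i, k, t, x):
    """Cox-de Boor recursion: value of the i-th degree-k B-spline basis at x."""
    if k == 0:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = right = 0.0
    if t[i + k] > t[i]:  # guard against zero-width spans from repeated knots
        left = (x - t[i]) / (t[i + k] - t[i]) * bspline_basis(i, k - 1, t, x)
    if t[i + k + 1] > t[i + 1]:
        right = (t[i + k + 1] - x) / (t[i + k + 1] - t[i + 1]) * bspline_basis(i + 1, k - 1, t, x)
    return left + right

def spline_eval(control, knots, degree, x):
    """Spline value as a combination of basis functions weighted by control
    points; in the paper's setting, the control points would be the outputs of
    a neural network conditioned on PDE parameters and ICBCs."""
    return sum(c * bspline_basis(i, degree, knots, x) for i, c in enumerate(control))

# Clamped cubic spline with 5 (hypothetical) control points on [0, 1]:
# repeating the end knots (degree + 1) times pins the spline to the first
# and last control points, so Dirichlet data is satisfied by construction.
degree = 3
control = [2.0, 0.5, -1.0, 0.5, 3.0]
knots = [0.0] * 4 + [0.5] + [1.0] * 4   # len(control) + degree + 1 knots
```

Because the basis functions are fixed polynomials, derivatives of the spline (and hence PDE residual losses) are available analytically rather than via automatic differentiation over the whole domain.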
Related papers
- BEACONS: Bounded-Error, Algebraically-Composable Neural Solvers for Partial Differential Equations [0.0]
We show how it is possible to circumvent these limitations by constructing formally-verified neural network solvers for PDEs. We show how to construct rigorous extrapolatory bounds on the worst-case Linf errors of shallow neural network approximations. The resulting framework, called BEACONS, comprises both an automatic code-prover for the neural solvers themselves and a bespoke automated theorem-generator for producing machine-checkable certificates of correctness.
arXiv Detail & Related papers (2026-02-16T15:49:19Z) - Towards A Unified PAC-Bayesian Framework for Norm-based Generalization Bounds [63.47271262149291]
We propose a unified framework for PAC-Bayesian norm-based generalization. The key to our approach is a sensitivity matrix that quantifies the sensitivity of network outputs to structured weight perturbations. We derive a family of generalization bounds that recover several existing PAC-Bayesian results as special cases.
arXiv Detail & Related papers (2026-01-13T00:42:22Z) - Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - Solving Differential Equations with Constrained Learning [8.522558872274276]
(Partial) differential equations (PDEs) are fundamental tools for describing natural phenomena, making their solution crucial in science and engineering. Traditional methods, such as the finite element method, provide reliable solutions, but their accuracy is tied to the use of computationally intensive fine meshes. This paper addresses these challenges by developing a science-constrained learning (SCL) framework. It demonstrates that finding a (weak) solution of a PDE is equivalent to solving a constrained learning problem with worst-case losses.
arXiv Detail & Related papers (2024-10-30T08:20:39Z) - HyResPINNs: Hybrid Residual Networks for Adaptive Neural and RBF Integration in Solving PDEs [22.689531776611084]
We introduce HyResPINNs, a novel class of PINNs featuring adaptive hybrid residual blocks that integrate standard neural networks and radial basis function (RBF) networks. A distinguishing characteristic of HyResPINNs is the use of adaptive combination parameters within each residual block, enabling dynamic weighting of the neural and RBF network contributions.
arXiv Detail & Related papers (2024-10-04T16:21:14Z) - A Physics Informed Neural Network (PINN) Methodology for Coupled Moving Boundary PDEs [0.0]
Physics-Informed Neural Network (PINN) is a novel multi-task learning framework useful for solving physical problems modeled using differential equations (DEs).
This paper reports a PINN-based approach to solving coupled systems involving multiple governing parameters (energy and species, along with multiple interface balance equations).
arXiv Detail & Related papers (2024-09-17T06:00:18Z) - A Hybrid Kernel-Free Boundary Integral Method with Operator Learning for Solving Parametric Partial Differential Equations In Complex Domains [0.0]
The Kernel-Free Boundary Integral (KFBI) method presents an iterative solution to boundary integral equations arising from elliptic partial differential equations (PDEs).
We propose a hybrid KFBI method, integrating the foundational principles of the KFBI method with the capabilities of deep learning.
arXiv Detail & Related papers (2024-04-23T17:25:35Z) - Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
arXiv Detail & Related papers (2024-03-19T08:56:20Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - A Deep Learning Framework for Solving Hyperbolic Partial Differential Equations: Part I [0.0]
This research focuses on the development of a physics informed deep learning framework to approximate solutions to nonlinear PDEs.
The framework naturally handles imposition of boundary conditions (Neumann/Dirichlet), entropy conditions, and regularity requirements.
arXiv Detail & Related papers (2023-07-09T08:27:17Z) - A Stable and Scalable Method for Solving Initial Value PDEs with Neural
Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from getting ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z) - Deep NURBS -- Admissible Physics-informed Neural Networks [0.0]
We propose a new numerical scheme for physics-informed neural networks (PINNs) that enables precise and inexpensive solutions for partial differential equations (PDEs).
The proposed approach combines admissible NURBS parametrizations required to define the physical domain and the Dirichlet boundary conditions with a PINN solver.
arXiv Detail & Related papers (2022-10-25T10:35:45Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - A Unified Hard-Constraint Framework for Solving Geometrically Complex PDEs [25.52271761404213]
We present a unified framework for solving geometrically complex PDEs with neural networks.
We first introduce the "extra fields" from the mixed finite element method to reformulate the PDEs.
We derive the general solutions of the BCs analytically, which are employed to construct an ansatz that automatically satisfies the BCs.
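The ansatz construction this entry describes — building a trial solution that satisfies the boundary conditions identically, so that training can never violate them — is commonly sketched in one dimension as follows. This is a generic illustration of the hard-constraint idea, not the paper's mixed-finite-element formulation; `hard_bc_ansatz` and the sine stand-in network are hypothetical names.

```python
import math

def hard_bc_ansatz(net, a, b, ua, ub):
    """Wrap an arbitrary function `net` so the result satisfies the Dirichlet
    conditions u(a) = ua and u(b) = ub exactly, for any parameters of `net`."""
    def u(x):
        # The linear interpolant g matches the boundary data; the bump
        # (x - a) * (b - x) vanishes at both endpoints, so the network
        # term cannot perturb the boundary values.
        g = ua + (ub - ua) * (x - a) / (b - a)
        return g + (x - a) * (b - x) * net(x)
    return u

net = lambda x: math.sin(3.0 * x)        # stand-in for a trained network
u = hard_bc_ansatz(net, 0.0, 1.0, 2.0, -1.0)
```

Because the boundary conditions hold by construction, the training loss can consist of the PDE residual alone, with no boundary penalty term to balance.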
arXiv Detail & Related papers (2022-10-06T06:19:33Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Backpropagation at the Infinitesimal Inference Limit of Energy-Based Models: Unifying Predictive Coding, Equilibrium Propagation, and Contrastive Hebbian Learning [41.58529335439799]
How the brain performs credit assignment is a fundamental unsolved problem in neuroscience.
Many "biologically plausible" algorithms have been proposed, which compute gradients that approximate those computed by backpropagation (BP).
arXiv Detail & Related papers (2022-05-31T20:48:52Z) - Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics [11.981731023317945]
Physics-informed deep learning (PiDL) addresses these challenges by incorporating physical principles into the model.
We propose to leverage physics prior knowledge by "baking" the discretized governing equations into the neural network architecture.
This method, embedding discretized PDEs through convolutional residual networks in a multi-resolution setting, largely improves the generalizability and long-term prediction.
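Embedding a discretized operator as a fixed convolution, as this entry describes, can be sketched with the classic 5-point Laplacian stencil. This is a generic illustration of the idea rather than the paper's multi-resolution architecture; `laplacian_conv` is a hypothetical name.

```python
import numpy as np

def laplacian_conv(u, h):
    """Apply the 5-point finite-difference Laplacian stencil to the interior
    of a 2D grid with spacing h. Written as shifted-slice sums, this is a
    convolution with the fixed kernel [[0,1,0],[1,-4,1],[0,1,0]] / h**2, so
    it can be embedded as a frozen conv layer in a residual network."""
    return (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
            - 4.0 * u[1:-1, 1:-1]) / h**2

# Demo: for u(x, y) = x^2 + y^2 the stencil is exact (second differences of
# quadratics are exact), so the discrete Laplacian is 4 everywhere.
h = 0.1
x = np.linspace(0.0, 1.0, 11)
X, Y = np.meshgrid(x, x, indexing="ij")
lap = laplacian_conv(X**2 + Y**2, h)
```

Keeping the stencil weights frozen means the governing equations constrain every forward pass, rather than entering only through a soft residual loss.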
arXiv Detail & Related papers (2022-05-09T01:27:58Z) - Physics-constrained Unsupervised Learning of Partial Differential Equations using Meshes [1.066048003460524]
Graph neural networks show promise in accurately representing irregularly meshed objects and learning their dynamics.
In this work, we represent meshes naturally as graphs, process these using Graph Networks, and formulate our physics-based loss to provide an unsupervised learning framework for partial differential equations (PDEs).
Our framework will enable the application of PDE solvers in interactive settings, such as model-based control of soft-body deformations.
arXiv Detail & Related papers (2022-03-30T19:22:56Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics-informed neural networks have difficulty resolving localized effects and strongly nonlinear solutions through optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement, and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Learning to Control PDEs with Differentiable Physics [102.36050646250871]
We present a novel hierarchical predictor-corrector scheme which enables neural networks to learn to understand and control complex nonlinear physical systems over long time frames.
We demonstrate that our method successfully develops an understanding of complex physical systems and learns to control them for tasks involving PDEs.
arXiv Detail & Related papers (2020-01-21T11:58:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.