Statistical Learning Analysis of Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2602.11097v1
- Date: Wed, 11 Feb 2026 18:09:29 GMT
- Title: Statistical Learning Analysis of Physics-Informed Neural Networks
- Authors: David A. Barajas-Solano
- Abstract summary: We study the training and performance of physics-informed learning for initial and boundary value problems with physics-informed neural networks (PINNs). We use the so-called Local Learning Coefficient to analyze the estimates of PINN parameters obtained via optimization for a heat equation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the training and performance of physics-informed learning for initial and boundary value problems (IBVP) with physics-informed neural networks (PINNs) from a statistical learning perspective. Specifically, we restrict ourselves to parameterizations with hard initial and boundary condition constraints and reformulate the problem of estimating PINN parameters as a statistical learning problem. From this perspective, the physics penalty on the IBVP residuals can be better understood not as a regularizing term but as an infinite source of indirect data, and the learning process as fitting the PINN distribution of residuals $p(y \mid x, t, w)\, q(x, t)$ to the true data-generating distribution $\delta(0)\, q(x, t)$ by minimizing the Kullback-Leibler divergence between the true and PINN distributions. Furthermore, this analysis shows that physics-informed learning with PINNs is a singular learning problem, and we employ singular learning theory tools, namely the so-called Local Learning Coefficient (Lau et al., 2025), to analyze the estimates of PINN parameters obtained via stochastic optimization for a heat equation IBVP. Finally, we discuss the implications of this analysis for the quantification of predictive uncertainty of PINNs and for the extrapolation capacity of PINNs.
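The correspondence the abstract describes can be checked numerically. The sketch below is an illustration, not the authors' code: it assumes a Gaussian residual model $p(y \mid x, t, w) = \mathcal{N}(y; r(x, t; w), \sigma^2)$ and uses synthetic residuals in place of a real PINN's PDE residuals, then verifies that the average negative log-likelihood of the "true" data $y = 0$ equals the familiar mean-squared physics penalty up to an additive constant.

```python
import numpy as np

# Hypothetical residuals r(x_i, t_i; w) at collocation points, standing in
# for the PDE residuals of a trained PINN (assumption for illustration).
rng = np.random.default_rng(0)
sigma = 0.1
residuals = rng.normal(0.0, 0.5, size=1000)

# Average negative log-likelihood of the data y = 0 under the Gaussian
# residual model N(y; r_i, sigma^2).
nll = np.mean(residuals**2 / (2 * sigma**2)
              + 0.5 * np.log(2 * np.pi * sigma**2))

# The usual mean-squared physics penalty, rescaled by 1 / (2 sigma^2).
mse_penalty = np.mean(residuals**2) / (2 * sigma**2)

# Up to the constant 0.5 * log(2 pi sigma^2), the two coincide, so driving
# the physics penalty down drives down the KL divergence between
# delta(0) q(x, t) and p(y | x, t, w) q(x, t).
constant = 0.5 * np.log(2 * np.pi * sigma**2)
print(np.isclose(nll, mse_penalty + constant))  # True
```

This is why the physics penalty acts as (indirect) data: each collocation point contributes a likelihood term for the observation "residual equals zero."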
Related papers
- On the Role of Consistency Between Physics and Data in Physics-Informed Neural Networks [2.9455202926636175]
Physics-informed neural networks (PINNs) have gained significant attention as a surrogate modeling strategy for partial differential equations (PDEs). PINNs are frequently trained using experimental or numerical data that are not fully consistent with the governing equations due to measurement noise, discretization errors, or modeling assumptions. We analyze how data inconsistency limits the attainable accuracy of PINNs.
arXiv Detail & Related papers (2026-02-11T08:00:53Z) - Prediction error certification for PINNs: Theory, computation, and application to Stokes flow [0.0]
Rigorous error estimation is a fundamental topic in numerical analysis. With the increasing use of physics-informed neural networks (PINNs) for solving partial differential equations, several approaches have been developed to quantify the associated prediction error. We build upon a semigroup-based framework previously introduced by the authors for estimating the PINN error.
arXiv Detail & Related papers (2025-08-11T13:57:02Z) - PINNverse: Accurate parameter estimation in differential equations from noisy data with constrained physics-informed neural networks [0.0]
Physics-Informed Neural Networks (PINNs) have emerged as effective tools for solving such problems. We introduce PINNverse, a training paradigm that addresses these limitations by reformulating the learning process as a constrained differential optimization problem. We demonstrate robust and accurate parameter estimation from noisy data in four classical ODE and PDE models from physics and biology.
arXiv Detail & Related papers (2025-04-07T16:34:57Z) - ProPINN: Demystifying Propagation Failures in Physics-Informed Neural Networks [71.02216400133858]
Physics-informed neural networks (PINNs) have earned high expectations in solving partial differential equations (PDEs). Previous research observed the propagation failure phenomenon of PINNs. This paper provides a formal and in-depth study of propagation failure and its root cause.
arXiv Detail & Related papers (2025-02-02T13:56:38Z) - Conformalized Physics-Informed Neural Networks [0.8437187555622164]
We introduce Conformalized PINNs (C-PINNs), which use the framework of conformal prediction to quantify the uncertainty of PINNs.
arXiv Detail & Related papers (2024-05-13T18:45:25Z) - coVariance Neural Networks [119.45320143101381]
Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning.
We propose a GNN architecture, called coVariance neural network (VNN), that operates on sample covariance matrices as graphs.
We show that VNN performance is indeed more stable than PCA-based statistical approaches.
arXiv Detail & Related papers (2022-05-31T15:04:43Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural
Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Revisiting PINNs: Generative Adversarial Physics-informed Neural
Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
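The derivative trick described above can be sketched in a few lines. This is an illustration under stated assumptions, not the paper's implementation: for the Gaussian-smoothed model $f_\sigma(x) = \mathbb{E}[f(x + \sigma\varepsilon)]$ with $\varepsilon \sim \mathcal{N}(0, 1)$, Stein's identity yields $f_\sigma''(x) = \mathbb{E}[(\varepsilon^2 - 1)\, f(x + \sigma\varepsilon)] / \sigma^2$, so second derivatives require only forward evaluations of $f$ and no back-propagation.

```python
import numpy as np

def stein_second_derivative(f, x, sigma=0.5, n_samples=100_000, seed=0):
    """Monte-Carlo estimate of the second derivative of the Gaussian-smoothed
    f_sigma(x) = E[f(x + sigma * eps)] via Stein's identity:
    f_sigma''(x) = E[(eps^2 - 1) * f(x + sigma * eps)] / sigma^2.
    Only forward evaluations of f are used (no back-propagation)."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n_samples)
    return np.mean((eps**2 - 1.0) * f(x + sigma * eps)) / sigma**2

# Sanity check on a toy function: for f(x) = x^3 the smoothed function is
# x^3 + 3 sigma^2 x, whose second derivative is 6x, the same as f''(x) = 6x.
estimate = stein_second_derivative(lambda x: x**3, x=1.0)
print(estimate)  # close to 6.0 (Monte-Carlo noise remains)
```

In a PINN context the same estimator would replace the second-order spatial derivatives in the PDE residual, trading autodiff passes for extra forward samples.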
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Robust Learning of Physics Informed Neural Networks [2.86989372262348]
Physics-informed Neural Networks (PINNs) have been shown to be effective in solving partial differential equations.
This paper shows that a PINN can be sensitive to errors in training data and can overfit by dynamically propagating these errors over the solution domain of the PDE.
arXiv Detail & Related papers (2021-10-26T00:10:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.