On the Role of Consistency Between Physics and Data in Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2602.10611v1
- Date: Wed, 11 Feb 2026 08:00:53 GMT
- Title: On the Role of Consistency Between Physics and Data in Physics-Informed Neural Networks
- Authors: Nicolás Becerra-Zuniga, Lucas Lacasa, Eusebio Valero, Gonzalo Rubio
- Abstract summary: Physics-informed neural networks (PINNs) have gained significant attention as a surrogate modeling strategy for partial differential equations (PDEs). PINNs are frequently trained using experimental or numerical data that are not fully consistent with the governing equations due to measurement noise, discretization errors, or modeling assumptions. We analyze how data inconsistency limits the attainable accuracy of PINNs.
- Score: 2.9455202926636175
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed neural networks (PINNs) have gained significant attention as a surrogate modeling strategy for partial differential equations (PDEs), particularly in regimes where labeled data are scarce and physical constraints can be leveraged to regularize the learning process. In practice, however, PINNs are frequently trained using experimental or numerical data that are not fully consistent with the governing equations due to measurement noise, discretization errors, or modeling assumptions. The implications of such data-to-PDE inconsistencies on the accuracy and convergence of PINNs remain insufficiently understood. In this work, we systematically analyze how data inconsistency fundamentally limits the attainable accuracy of PINNs. We introduce the concept of a consistency barrier, defined as an intrinsic lower bound on the error that arises from mismatches between the fidelity of the data and the exact enforcement of the PDE residual. To isolate and quantify this effect, we consider the 1D viscous Burgers equation with a manufactured analytical solution, which enables full control over data fidelity and residual errors. PINNs are trained using datasets of progressively increasing numerical accuracy, as well as perfectly consistent analytical data. Results show that while the inclusion of the PDE residual allows PINNs to partially mitigate low-fidelity data and recover the dominant physical structure, the training process ultimately saturates at an error level dictated by the data inconsistency. When high-fidelity numerical data are employed, PINN solutions become indistinguishable from those trained on analytical data, indicating that the consistency barrier is effectively removed. These findings clarify the interplay between data quality and physics enforcement in PINNs, providing practical guidance for the construction and interpretation of physics-informed surrogate models.
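The training setup the abstract describes combines a data-fidelity loss with the residual of the 1D viscous Burgers equation, u_t + u u_x - nu u_xx = 0. A minimal sketch of such a composite loss is shown below; the network size, viscosity nu, loss weights, and sample counts are illustrative assumptions, not the paper's actual configuration.

```python
import torch

torch.manual_seed(0)

# Small illustrative network mapping (x, t) -> u; the paper's architecture is not specified here.
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def burgers_residual(net, x, t, nu=0.01):
    """Residual of u_t + u * u_x - nu * u_xx for the 1D viscous Burgers equation."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t + u * u_x - nu * u_xx

def pinn_loss(net, x_d, t_d, u_d, x_c, t_c, w_data=1.0, w_pde=1.0):
    """Composite loss: fidelity to (possibly inconsistent) data plus PDE residual
    at collocation points. If u_d violates the PDE, the two terms cannot both
    vanish, which is the mechanism behind the consistency barrier."""
    u_pred = net(torch.cat([x_d, t_d], dim=1))
    loss_data = torch.mean((u_pred - u_d) ** 2)
    loss_pde = torch.mean(burgers_residual(net, x_c, t_c) ** 2)
    return w_data * loss_data + w_pde * loss_pde

# Random placeholder data and collocation points, for illustration only.
x_d, t_d, u_d = torch.rand(16, 1), torch.rand(16, 1), torch.rand(16, 1)
x_c, t_c = torch.rand(64, 1), torch.rand(64, 1)
loss = pinn_loss(net, x_d, t_d, u_d, x_c, t_c)
print(float(loss) >= 0.0)  # prints True: both terms are mean squares
```

In this framing, varying the fidelity of `u_d` (analytical vs. progressively coarser numerical solutions) while keeping the residual term fixed is what lets the error floor imposed by data inconsistency be isolated.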
Related papers
- Unlearning Noise in PINNs: A Selective Pruning Framework for PDE Inverse Problems [11.570920716097552]
Physics-informed neural networks (PINNs) provide a promising framework for solving inverse problems governed by partial differential equations (PDEs). The ill-posed nature of PDE inverse problems makes them highly sensitive to noise. We propose P-PINN, a selective pruning framework designed to unlearn the influence of corrupted data in a pretrained PINN.
arXiv Detail & Related papers (2026-02-23T15:29:50Z) - PINNverse: Accurate parameter estimation in differential equations from noisy data with constrained physics-informed neural networks [0.0]
Physics-Informed Neural Networks (PINNs) have emerged as effective tools for solving such problems. We introduce PINNverse, a training paradigm that addresses these limitations by reformulating the learning process as a constrained differential optimization problem. We demonstrate robust and accurate parameter estimation from noisy data in four classical ODE and PDE models from physics and biology.
arXiv Detail & Related papers (2025-04-07T16:34:57Z) - Paving the way for scientific foundation models: enhancing generalization and robustness in PDEs with constraint-aware pre-training [49.8035317670223]
A scientific foundation model (SciFM) is emerging as a promising tool for learning transferable representations across diverse domains. We propose incorporating PDE residuals into pre-training, either as the sole learning signal or in combination with data loss, to compensate for limited or infeasible training data. Our results show that pre-training with PDE constraints significantly enhances generalization, outperforming models trained solely on solution data.
arXiv Detail & Related papers (2025-03-24T19:12:39Z) - Solving Oscillator Ordinary Differential Equations in the Time Domain with High Performance via Soft-constrained Physics-informed Neural Network with Small Data [0.6446246430600296]
Physics-informed neural networks (PINNs) incorporate physical information and knowledge into the network topology or computational process as model priors. This study investigates the performance characteristics of the soft-constrained PINN method for solving typical linear and nonlinear ordinary differential equations.
arXiv Detail & Related papers (2024-08-19T13:02:06Z) - DeltaPhi: Physical States Residual Learning for Neural Operators in Data-Limited PDE Solving [54.605760146540234]
DeltaPhi is a novel learning framework that transforms the PDE solving task from learning direct input-output mappings to learning the residuals between similar physical states. Extensive experiments demonstrate consistent and significant improvements across diverse physical systems.
arXiv Detail & Related papers (2024-06-14T07:45:07Z) - Correcting model misspecification in physics-informed neural networks (PINNs) [2.07180164747172]
We present a general approach to correct the misspecified physical models in PINNs for discovering governing equations.
We employ other deep neural networks (DNNs) to model the discrepancy between the imperfect models and the observational data.
We envision that the proposed approach will extend the applications of PINNs for discovering governing equations in problems where the physico-chemical or biological processes are not well understood.
arXiv Detail & Related papers (2023-10-16T19:25:52Z) - Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs can suffer training failures when the target functions to be approximated exhibit high-frequency or multi-scale features. In this paper, we propose employing the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Robust Regression with Highly Corrupted Data via Physics Informed Neural Networks [4.642273921499256]
Physics-informed neural networks (PINNs) have been proposed to solve two main classes of problems.
We show the generalizability, accuracy and efficiency of the proposed algorithms for recovering governing equations from noisy and corrupted measurement data.
arXiv Detail & Related papers (2022-10-19T15:21:05Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation. Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Robust Learning of Physics Informed Neural Networks [2.86989372262348]
Physics-informed Neural Networks (PINNs) have been shown to be effective in solving partial differential equations.
This paper shows that a PINN can be sensitive to errors in the training data and can overfit to them, dynamically propagating these errors over the solution domain of the PDE.
arXiv Detail & Related papers (2021-10-26T00:10:57Z) - HYDRA: Hypergradient Data Relevance Analysis for Interpreting Deep Neural Networks [63.02441733983715]
We propose Hypergradient Data Relevance Analysis, or HYDRA, which interprets predictions made by deep neural networks (DNNs) as effects of their training data. HYDRA assesses the contribution of training data toward test data points throughout the training trajectory. In addition, we quantitatively demonstrate that HYDRA outperforms influence functions in accurately estimating data contribution and detecting noisy data labels.
arXiv Detail & Related papers (2021-02-04T10:00:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.