Unsupervised Physics-Informed Operator Learning through Multi-Stage Curriculum Training
- URL: http://arxiv.org/abs/2602.02264v1
- Date: Mon, 02 Feb 2026 16:06:57 GMT
- Title: Unsupervised Physics-Informed Operator Learning through Multi-Stage Curriculum Training
- Authors: Paolo Marcandelli, Natansh Mathur, Stefano Markidis, Martina Siena, Stefano Mariani
- Abstract summary: We introduce a physics-informed training strategy that achieves convergence by progressively enforcing boundary conditions in the loss landscape. At each stage the optimizer is re-initialized, acting as a continuation mechanism that restores stability and prevents gradient stagnation. Across canonical benchmarks, PhIS-FNO attains a level of accuracy comparable to that of supervised learning.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Solving partial differential equations remains a central challenge in scientific machine learning. Neural operators offer a promising route by learning mappings between function spaces and enabling resolution-independent inference, yet they typically require supervised data. Physics-informed neural networks address this limitation through unsupervised training with physical constraints but often suffer from unstable convergence and limited generalization capability. To overcome these issues, we introduce a multi-stage physics-informed training strategy that achieves convergence by progressively enforcing boundary conditions in the loss landscape and subsequently incorporating interior residuals. At each stage the optimizer is re-initialized, acting as a continuation mechanism that restores stability and prevents gradient stagnation. We further propose the Physics-Informed Spline Fourier Neural Operator (PhIS-FNO), combining Fourier layers with Hermite spline kernels for smooth residual evaluation. Across canonical benchmarks, PhIS-FNO attains a level of accuracy comparable to that of supervised learning, using labeled information only along a narrow boundary region, establishing staged, spline-based optimization as a robust paradigm for physics-informed operator learning.
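The staged continuation mechanism described in the abstract can be sketched as a toy training loop: each stage re-weights the boundary and interior residual terms, and the optimizer's state is re-initialized at the stage boundary. This is a minimal illustration under assumed weights, a toy quadratic loss, and momentum SGD standing in for the actual optimizer; it is not the paper's PhIS-FNO training code, and all names and values are illustrative.

```python
import numpy as np

def run_stage(theta, w_bc, w_int, lr=0.1, beta=0.9, steps=500):
    """One curriculum stage: minimize w_bc * r_bc^2 + w_int * r_int^2."""
    velocity = np.zeros_like(theta)  # optimizer state re-initialized per stage
    for _ in range(steps):
        # toy residuals standing in for boundary / interior PDE terms
        r_bc, r_int = theta[0] - 1.0, theta[1] - 2.0
        grad = np.array([2 * w_bc * r_bc, 2 * w_int * r_int])
        velocity = beta * velocity - lr * grad
        theta = theta + velocity
    return theta

theta = np.zeros(2)
# curriculum: boundary loss alone first, interior residual phased in later
for w_bc, w_int in [(1.0, 0.0), (1.0, 0.5), (1.0, 1.0)]:
    theta = run_stage(theta, w_bc, w_int)
```

During the first stage only the "boundary" parameter moves; the later stages pull the "interior" parameter in while restarting the momentum buffer, which is the continuation effect the abstract describes.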
Related papers
- Tackling multiphysics problems via finite element-guided physics-informed operator learning [0.0]
This work presents a finite element-guided physics-informed operator learning framework for multiphysics problems. The framework learns a mapping from the input parameter space to the solution space with a weighted residual formulation based on the finite element method. It is verified on nonlinear thermo-mechanical problems.
arXiv Detail & Related papers (2026-03-02T03:52:51Z) - When Learning Hurts: Fixed-Pole RNN for Real-Time Online Training [58.25341036646294]
We analytically and empirically examine why learning recurrent poles does not provide tangible benefits in real-time learning scenarios. We show that fixed-pole networks achieve superior performance with lower training complexity, making them more suitable for online real-time tasks.
arXiv Detail & Related papers (2026-02-25T00:15:13Z) - Physics-Informed Laplace Neural Operator for Solving Partial Differential Equations [11.064132774859553]
Physics-Informed Laplace Neural Operator (PILNO) is a fast surrogate solver for partial differential equations. It embeds physics into training through PDE, boundary condition, and initial condition residuals. PILNO consistently improves accuracy in small-data settings, reduces run-to-run variability across random seeds, and achieves stronger generalization than purely data-driven baselines.
arXiv Detail & Related papers (2026-02-13T08:19:40Z) - Spectral Analysis of Hard-Constraint PINNs: The Spatial Modulation Mechanism of Boundary Functions [4.170072254495455]
This work reveals that the boundary function $B$ introduces a multiplicative spatial modulation that fundamentally alters the learning landscape. A rigorous Neural Tangent Kernel (NTK) framework for HC-PINNs is established, deriving the explicit kernel composition law. It is shown that widely used boundary functions can inadvertently induce spectral collapse, leading to optimization stagnation despite exact boundary satisfaction.
arXiv Detail & Related papers (2025-12-29T08:31:58Z) - Physics-informed Neural Operator Learning for Nonlinear Grad-Shafranov Equation [18.564353542797946]
In magnetic confinement nuclear fusion, rapid and accurate solution of the Grad-Shafranov equation (GSE) is essential for real-time plasma control and analysis. Traditional numerical solvers achieve high precision but are computationally prohibitive, while data-driven surrogates infer quickly but fail to enforce physical laws and generalize poorly beyond training distributions. We present a Physics-Informed Neural Operator (PINO) that directly learns the GSE solution operator, mapping shape parameters of the last closed flux surface to equilibrium solutions for realistic nonlinear current profiles.
arXiv Detail & Related papers (2025-11-24T13:46:38Z) - PhysicsCorrect: A Training-Free Approach for Stable Neural PDE Simulations [4.7903561901859355]
We present PhysicsCorrect, a training-free correction framework that enforces PDE consistency at each prediction step. Our key innovation is an efficient caching strategy that precomputes the Jacobian and its pseudoinverse during an offline warm-up phase. Across three representative PDE systems, PhysicsCorrect reduces prediction errors by up to 100x while adding negligible inference time.
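The cached-pseudoinverse correction described in this snippet amounts to one Newton-like step, u ← u − J⁺ r(u), applied to a network's prediction. A minimal sketch on a discretized 1D Poisson problem, where the residual is linear so the Jacobian is the constant stiffness matrix; this is illustrative only, not the authors' implementation:

```python
import numpy as np

n, h = 31, 1.0 / 32
x = np.linspace(h, 1 - h, n)
# 1D Poisson: -u'' = f, u(0) = u(1) = 0, with f = pi^2 sin(pi x)
A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
f = np.pi**2 * np.sin(np.pi * x)

residual = lambda u: A @ u - f      # PDE consistency check at each step
J_pinv = np.linalg.pinv(A)          # "offline warm-up": cache the pseudoinverse

# noisy stand-in for a neural surrogate's prediction
u_pred = np.sin(np.pi * x) + 0.05 * np.random.default_rng(0).standard_normal(n)
u_corr = u_pred - J_pinv @ residual(u_pred)   # one cheap correction step
```

Because the residual here is linear, one step drives it to numerical zero; for nonlinear PDEs the cached Jacobian makes the step approximate but still cheap.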
arXiv Detail & Related papers (2025-07-03T01:22:57Z) - DeltaPhi: Physical States Residual Learning for Neural Operators in Data-Limited PDE Solving [54.605760146540234]
DeltaPhi is a novel learning framework that transforms the PDE solving task from learning direct input-output mappings to learning the residuals between similar physical states. Extensive experiments demonstrate consistent and significant improvements across diverse physical systems.
arXiv Detail & Related papers (2024-06-14T07:45:07Z) - Scaling physics-informed hard constraints with mixture-of-experts [0.0]
We develop a scalable approach to enforce hard physical constraints using Mixture-of-Experts (MoE).
MoE imposes the constraint over smaller domains, each of which is solved by an "expert" through differentiable optimization.
Compared to standard differentiable optimization, our scalable approach achieves greater accuracy in the neural PDE solver setting.
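The per-subdomain hard-constraint idea above can be illustrated with a toy linear constraint enforced by a least-squares projection on each "expert" subdomain. The constraint, partition, and targets here are hypothetical, chosen only to show the structure, not the paper's formulation:

```python
import numpy as np

def project_constraint(u, total):
    # least-squares projection onto the affine set sum(u) == total
    return u + (total - u.sum()) / u.size

def moe_hard_constraint(u, n_experts, totals):
    # each expert enforces the constraint only on its own subdomain
    parts = np.array_split(u, n_experts)
    return np.concatenate([project_constraint(p, c)
                           for p, c in zip(parts, totals)])

u = np.random.default_rng(1).random(12)   # unconstrained network output
totals = [1.0, 2.0, 3.0]                  # per-subdomain constraint targets
u_hc = moe_hard_constraint(u, 3, totals)
```

Splitting the projection across subdomains keeps each solve small, which is the scalability argument the snippet makes relative to one global constrained solve.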
arXiv Detail & Related papers (2024-02-20T22:45:00Z) - Physics-Informed Deep Learning of Rate-and-State Fault Friction [0.0]
We develop a multi-network PINN for both the forward problem and for direct inversion of nonlinear fault friction parameters.
We present the computational PINN framework for strike-slip faults in 1D and 2D subject to rate-and-state friction.
We find that the network for the parameter inversion at the fault performs much better than the network for material displacements to which it is coupled.
arXiv Detail & Related papers (2023-12-14T23:53:25Z) - On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function, that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss allows for considering more practical techniques, such as time-varying learning rates and feature normalization.
arXiv Detail & Related papers (2023-12-13T02:11:07Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning tasks into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
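The spatial part of the decomposition can be illustrated by interleaved subsampling: a fine grid splits exactly into s×s coarser subgrids and recombines without loss. This is a schematic of the decomposition only (the paper also staggers in time), with illustrative names:

```python
import numpy as np

def stagger(field, s):
    # split an (H, W) field into s*s interleaved coarser subfields
    return [field[i::s, j::s] for i in range(s) for j in range(s)]

def unstagger(subfields, s, shape):
    # exact inverse: interleave the subfields back onto the fine grid
    out = np.empty(shape)
    for k, sub in enumerate(subfields):
        i, j = divmod(k, s)
        out[i::s, j::s] = sub
    return out

field = np.arange(64.0).reshape(8, 8)
subs = stagger(field, 2)                 # four 4x4 coarser subtasks
recon = unstagger(subs, 2, field.shape)
```

Each subtask sees a quarter of the points, so a solver (or network) can run on the coarser grids in parallel and the fine-grid field is recovered exactly.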
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
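The probabilistic representation mentioned here is in the spirit of the Feynman-Kac formula: for the heat equation u_t = u_xx, the solution at (x, t) is an ensemble average of the initial condition over random Brownian particles. A minimal Monte Carlo sketch, illustrative rather than the paper's solver:

```python
import numpy as np

def heat_mc(x, t, g, n_particles=200_000, seed=0):
    # Feynman-Kac: u(x, t) = E[g(x + W_{2t})] solves u_t = u_xx, u(x, 0) = g(x)
    z = np.random.default_rng(seed).standard_normal(n_particles)
    return g(x + np.sqrt(2.0 * t) * z).mean()

x0, t0 = 0.7, 0.25
u_mc = heat_mc(x0, t0, np.sin)
u_exact = np.exp(-t0) * np.sin(x0)   # exact solution e^{-t} sin(x)
```

The particle average converges at the usual O(1/sqrt(N)) Monte Carlo rate, and such estimates can serve as unsupervised training targets for a neural solver.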
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics informed neural networks have difficulty resolving localized effects and strongly non-linear solutions through optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT-scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.