PILD: Physics-Informed Learning via Diffusion
- URL: http://arxiv.org/abs/2601.21284v1
- Date: Thu, 29 Jan 2026 05:33:51 GMT
- Title: PILD: Physics-Informed Learning via Diffusion
- Authors: Tianyi Zeng, Tianyi Wang, Jiaru Zhang, Zimo Zeng, Feiyang Zhang, Yiming Xu, Sikai Chen, Yajie Zou, Yangyang Wang, Junfeng Jiao, Christian Claudel, Xinbo Chen
- Abstract summary: Physics-Informed Learning via Diffusion (PILD) is a framework that unifies diffusion modeling and first-principles physical constraints. PILD substantially improves accuracy, stability, and generalization over existing physics-informed and diffusion-based baselines.
- Score: 10.91770676244394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion models have emerged as powerful generative tools for modeling complex data distributions, yet their purely data-driven nature limits their applicability in practical engineering and scientific problems where physical laws must be satisfied. This paper proposes Physics-Informed Learning via Diffusion (PILD), a framework that unifies diffusion modeling and first-principles physical constraints by introducing a virtual residual observation, sampled from a Laplace distribution, to supervise generation during training. To further integrate physical laws, a conditional embedding module injects physical information into the denoising network at multiple layers, ensuring consistent guidance throughout the diffusion process. The proposed PILD framework is concise, modular, and broadly applicable to problems governed by ordinary differential equations, partial differential equations, and algebraic equations or inequality constraints. Extensive experiments across engineering and scientific tasks, including vehicle trajectory estimation, tire force estimation, Darcy flow, and plasma dynamics, demonstrate that PILD substantially improves accuracy, stability, and generalization over existing physics-informed and diffusion-based baselines.
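To make the training objective concrete, here is a minimal sketch of how a Laplace-sampled virtual residual observation could supervise a denoising step. The abstract does not give the exact formulation, so the noise schedule, the unit loss weighting, and all names (`denoiser`, `physics_residual`, the Laplace scale `b`) are illustrative assumptions, not the authors' implementation:

```python
import torch

def pild_style_step(denoiser, physics_residual, x0, cond, b=0.1, T=1000):
    """A hedged sketch of one PILD-style training step (not the authors' code).

    `denoiser(x_t, t, cond)` predicts the added noise, with physical
    information injected through `cond`; `physics_residual(x0_hat)` evaluates
    the governing equation on the reconstructed sample.
    """
    t = torch.randint(0, T, (x0.shape[0],), device=x0.device)
    noise = torch.randn_like(x0)
    # Toy cosine schedule: alpha_bar in (0, 1], shaped to broadcast over (B, D).
    alpha_bar = torch.cos(0.5 * torch.pi * t.float() / T).view(-1, 1) ** 2
    x_t = alpha_bar.sqrt() * x0 + (1.0 - alpha_bar).sqrt() * noise

    eps_hat = denoiser(x_t, t, cond)
    denoise_loss = torch.mean((eps_hat - noise) ** 2)  # standard DDPM term

    # Reconstruct x0 from the noise prediction and supervise its physics
    # residual with a virtual observation drawn from Laplace(0, b); the
    # Laplace likelihood turns into an L1-type penalty on the residual.
    x0_hat = (x_t - (1.0 - alpha_bar).sqrt() * eps_hat) / alpha_bar.sqrt()
    r = physics_residual(x0_hat)
    virtual_obs = torch.distributions.Laplace(0.0, b).sample(r.shape).to(r.device)
    physics_loss = torch.mean(torch.abs(r - virtual_obs))

    return denoise_loss + physics_loss
```

Under this reading, sampling the residual target from a Laplace prior rather than pinning it to exactly zero acts as a soft, heavy-tailed physics constraint; the abstract does not specify how the two loss terms are weighted.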
Related papers
- A Multimodal Conditional Mixture Model with Distribution-Level Physics Priors [0.0]
This work develops a physics-informed multimodal conditional modeling framework based on mixture density representations. Physical knowledge is embedded through component-specific regularization terms that penalize violations of governing equations or physical laws. The proposed framework is evaluated across a range of scientific problems in which multimodality arises from intrinsic physical mechanisms rather than observational noise.
arXiv Detail & Related papers (2026-02-11T02:46:10Z) - Hybrid Generative Modeling for Incomplete Physics: Deep Grey-Box Meets Optimal Transport [48.06072022424773]
Many real-world systems are described only approximately, with missing or unknown terms in their equations. This makes the distribution of the physics model differ from the true data-generating process (DGP). We present a novel hybrid generative modeling approach that combines deep grey-box modelling with Optimal Transport (OT) methods to enhance incomplete physics models.
arXiv Detail & Related papers (2025-06-27T13:23:27Z) - Consistent World Models via Foresight Diffusion [56.45012929930605]
We argue that a key bottleneck in learning consistent diffusion-based world models lies in suboptimal predictive ability. We propose Foresight Diffusion (ForeDiff), a diffusion-based world modeling framework that enhances consistency by decoupling condition understanding from target denoising.
arXiv Detail & Related papers (2025-05-22T10:01:59Z) - Transformers from Diffusion: A Unified Framework for Neural Message Passing [79.9193447649011]
Message passing neural networks (MPNNs) have become a de facto class of model solutions. We propose an energy-constrained diffusion model, which integrates the inductive bias of diffusion with layer-wise energy constraints. Building on these insights, we devise a new class of message passing models, dubbed DIFFormer, whose global attention layers are derived from the principled energy-constrained diffusion framework.
arXiv Detail & Related papers (2024-09-13T17:54:41Z) - Physics-Informed Diffusion Models [0.0]
We present a framework that unifies generative modeling and partial differential equation fulfillment. Our approach reduces the residual error by up to two orders of magnitude compared to previous work in a fluid flow case study.
arXiv Detail & Related papers (2024-03-21T13:52:55Z) - Hybrid data-driven and physics-informed regularized learning of cyclic plasticity with Neural Networks [0.0]
The proposed model architecture is simpler and more efficient than existing solutions from the literature.
The validation of the approach is carried out by means of surrogate data obtained with the Armstrong-Frederick kinematic hardening model.
arXiv Detail & Related papers (2024-03-04T07:09:54Z) - Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We demonstrate the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - Learning Homogenization for Elliptic Operators [5.151892549395954]
Multiscale partial differential equations (PDEs) arise in various applications, and several schemes have been developed to solve them efficiently.
Homogenization theory is a powerful methodology that eliminates the small-scale dependence, resulting in simplified equations that are tractable.
This paper investigates the learnability of homogenized laws for elliptic operators in the presence of such complexities.
arXiv Detail & Related papers (2023-06-21T04:05:10Z) - Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly captures both the governing PDE and the material model.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces; a minimal sketch of the divergence-based density appears after this list.
arXiv Detail & Related papers (2021-08-18T09:00:24Z) - A physics-informed operator regression framework for extracting data-driven continuum models [0.0]
We present a framework for discovering continuum models from high fidelity molecular simulation data.
Our approach applies a neural network parameterization of governing physics in modal space.
We demonstrate the effectiveness of our framework for a variety of physics, including local and nonlocal diffusion processes and single and multiphase flows.
arXiv Detail & Related papers (2020-09-25T01:13:51Z)
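As referenced in the Moser Flow entry above, the reason MF needs no ODE solver at training time is that it defines its density in closed form as a base density minus the divergence of a learned vector field, mu(x) = nu(x) - div u(x). Below is a minimal sketch of that density evaluation in the Euclidean case; the names (`u`, `base_log_prob`) are illustrative assumptions, and keeping mu positive is handled by training penalties rather than by this function:

```python
import torch

def moser_flow_density(u, base_log_prob, x):
    """Hedged sketch of the Moser Flow density mu(x) = nu(x) - div u(x).

    `u` is a learned vector field mapping (B, d) -> (B, d), and
    `base_log_prob` is the log-density of a fixed base distribution nu.
    Training can then maximize log mu on data, plus a penalty that keeps
    mu positive, with no ODE solve over a trajectory required.
    """
    x = x.clone().requires_grad_(True)
    ux = u(x)
    div = torch.zeros(x.shape[0], device=x.device)
    for i in range(x.shape[1]):  # exact divergence via d backward passes
        div = div + torch.autograd.grad(ux[:, i].sum(), x, create_graph=True)[0][:, i]
    return base_log_prob(x).exp() - div
```

The exact divergence here costs one backward pass per input dimension; for the curved-surface setting the paper targets, the Euclidean divergence would presumably be replaced by its Riemannian counterpart.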
This list is automatically generated from the titles and abstracts of the papers on this site.