A PDE-Informed Latent Diffusion Model for 2-m Temperature Downscaling
- URL: http://arxiv.org/abs/2510.23866v1
- Date: Mon, 27 Oct 2025 21:17:03 GMT
- Title: A PDE-Informed Latent Diffusion Model for 2-m Temperature Downscaling
- Authors: Paul Rosu, Muchang Bahng, Erick Jiang, Rico Zhu, Vahid Tarokh
- Abstract summary: This work presents a physics-conditioned latent diffusion model tailored for dynamical downscaling of atmospheric data. Building upon a pre-existing diffusion architecture, we integrate a partial differential equation (PDE) loss term into the model's training objective. We investigate how fine-tuning with this additional loss further regularizes the model and enhances the physical plausibility of the generated fields.
- Score: 14.475761134288573
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work presents a physics-conditioned latent diffusion model tailored for dynamical downscaling of atmospheric data, with a focus on reconstructing high-resolution 2-m temperature fields. Building upon a pre-existing diffusion architecture and employing a residual formulation against a reference UNet, we integrate a partial differential equation (PDE) loss term into the model's training objective. The PDE loss is computed in the full-resolution (pixel) space by decoding the latent representation and is designed to enforce physical consistency through a finite-difference approximation of an effective advection-diffusion balance. Empirical observations indicate that conventional diffusion training already yields low PDE residuals, and we investigate how fine-tuning with this additional loss further regularizes the model and enhances the physical plausibility of the generated fields. Our full codebase is available on GitHub for future reference and development.
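The abstract describes the PDE loss only at a high level. As a hedged sketch (the grid layout, variable names, and the exact advection-diffusion balance below are illustrative assumptions, not the authors' implementation), a finite-difference residual of dT/dt + u dT/dx + v dT/dy - kappa * laplacian(T) on a decoded temperature field could look like:

```python
import numpy as np

def advection_diffusion_residual(T_prev, T_next, u, v, dt, dx, kappa):
    """Finite-difference residual of an advection-diffusion balance,
    dT/dt + u dT/dx + v dT/dy - kappa * lap(T) ~ 0,
    evaluated on the interior of a regular 2-D grid.
    Illustrative only; the paper's effective equation may differ."""
    # central differences in space on the current field
    dTdx = (T_next[1:-1, 2:] - T_next[1:-1, :-2]) / (2 * dx)
    dTdy = (T_next[2:, 1:-1] - T_next[:-2, 1:-1]) / (2 * dx)
    # five-point Laplacian stencil
    lap = (T_next[1:-1, 2:] + T_next[1:-1, :-2]
           + T_next[2:, 1:-1] + T_next[:-2, 1:-1]
           - 4.0 * T_next[1:-1, 1:-1]) / dx**2
    # forward difference in time between two decoded frames
    dTdt = (T_next[1:-1, 1:-1] - T_prev[1:-1, 1:-1]) / dt
    return dTdt + u[1:-1, 1:-1] * dTdx + v[1:-1, 1:-1] * dTdy - kappa * lap

def pde_loss(T_prev, T_next, u, v, dt, dx, kappa):
    """Mean squared PDE residual, added as an extra term to the
    diffusion training objective."""
    r = advection_diffusion_residual(T_prev, T_next, u, v, dt, dx, kappa)
    return float(np.mean(r**2))
```

In the paper's setup this residual would be computed after decoding the latent sample back to pixel space, so its gradient flows through the decoder during fine-tuning.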
Related papers
- Physics-informed diffusion models in spectral space [2.5315729179239637]
We propose a methodology that combines generative latent diffusion models with physics-informed machine learning. We learn the joint distribution of PDE parameters and solutions via a diffusion process in a latent space of scaled spectral representations. We evaluate the proposed approach on Poisson, Helmholtz, and incompressible Navier-Stokes equations, demonstrating improved accuracy and computational efficiency.
arXiv Detail & Related papers (2026-02-10T12:11:07Z) - Condition Errors Refinement in Autoregressive Image Generation with Diffusion Loss [56.120591983649824]
We present a theoretical analysis of diffusion and autoregressive models with diffusion loss. We show that patch denoising optimization in autoregressive models effectively mitigates condition errors and leads to a stable condition distribution. We introduce a novel condition refinement approach based on Optimal Transport (OT) theory to address condition inconsistency.
arXiv Detail & Related papers (2026-02-02T07:48:04Z) - A PDE Perspective on Generative Diffusion Models [8.328108675535562]
We develop a rigorous partial differential equation (PDE) framework for score-based diffusion processes. We derive sharp $L^p$-stability estimates for the associated score-based Fokker-Planck dynamics. The results yield a theoretical guarantee that, under exact guidance, diffusion trajectories return to the data manifold.
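For context, the Fokker-Planck equation referenced here is the standard evolution law for the marginal density $p_t$ of a diffusion SDE $\mathrm{d}x = f(x,t)\,\mathrm{d}t + g(t)\,\mathrm{d}W_t$ (a textbook statement, not taken from the paper itself):

```latex
\partial_t p_t(x)
  = -\nabla \cdot \bigl( f(x,t)\, p_t(x) \bigr)
  + \tfrac{1}{2}\, g(t)^2 \, \Delta p_t(x)
```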
arXiv Detail & Related papers (2025-11-08T09:19:25Z) - Physics-Informed Distillation of Diffusion Models for PDE-Constrained Generation [19.734778762515468]
Diffusion models have gained increasing attention in the modeling of physical systems, particularly those governed by partial differential equations (PDEs). We propose a simple yet effective post-hoc distillation approach, where PDE constraints are not injected directly into the diffusion process, but instead enforced during a post-hoc distillation stage.
arXiv Detail & Related papers (2025-05-28T14:17:58Z) - Guided Diffusion Sampling on Function Spaces with Applications to PDEs [112.09025802445329]
We propose a general framework for conditional sampling in PDE-based inverse problems. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines.
arXiv Detail & Related papers (2025-05-22T17:58:12Z) - Generative Latent Neural PDE Solver using Flow Matching [8.397730500554047]
We propose a latent diffusion model for PDE simulation that embeds the PDE state in a lower-dimensional latent space. Our framework uses an autoencoder to map different types of meshes onto a unified structured latent grid, capturing complex geometries. Numerical experiments show that the proposed model outperforms several deterministic baselines in both accuracy and long-term stability.
arXiv Detail & Related papers (2025-03-28T16:44:28Z) - Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - Distilling Diffusion Models into Conditional GANs [90.76040478677609]
We distill a complex multistep diffusion model into a single-step conditional GAN student model.
For an efficient regression loss, we propose E-LatentLPIPS, a perceptual loss operating directly in the diffusion model's latent space.
We demonstrate that our one-step generator outperforms cutting-edge one-step diffusion distillation models.
arXiv Detail & Related papers (2024-05-09T17:59:40Z) - Closing the ODE-SDE gap in score-based diffusion models through the Fokker-Planck equation [0.562479170374811]
We rigorously describe the range of dynamics and approximations that arise when training score-based diffusion models.
We show numerically that conventional score-based diffusion models can exhibit significant differences between ODE- and SDE-induced distributions.
arXiv Detail & Related papers (2023-11-27T16:44:50Z) - Exploring the Optimal Choice for Generative Processes in Diffusion Models: Ordinary vs Stochastic Differential Equations [6.2284442126065525]
We study the problem mathematically for two limiting scenarios: the zero diffusion (ODE) case and the large diffusion case.
Our findings indicate that when the perturbation occurs at the end of the generative process, the ODE model outperforms the SDE model with a large diffusion coefficient.
arXiv Detail & Related papers (2023-06-03T09:27:15Z) - Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z) - Unifying Diffusion Models' Latent Space, with Applications to CycleDiffusion and Guidance [95.12230117950232]
We show that a common latent space emerges from two diffusion models trained independently on related domains.
Applying CycleDiffusion to text-to-image diffusion models, we show that large-scale text-to-image diffusion models can be used as zero-shot image-to-image editors.
arXiv Detail & Related papers (2022-10-11T15:53:52Z) - Stationary Density Estimation of Itô Diffusions Using Deep Learning [6.8342505943533345]
We consider the density estimation problem associated with the stationary measure of ergodic Itô diffusions from a discrete-time series.
We employ deep neural networks to approximate the drift and diffusion terms of the SDE.
We establish the convergence of the proposed scheme under appropriate mathematical assumptions.
arXiv Detail & Related papers (2021-09-09T01:57:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.