Differentiable Inverse Modeling with Physics-Constrained Latent Diffusion for Heterogeneous Subsurface Parameter Fields
- URL: http://arxiv.org/abs/2512.22421v1
- Date: Sat, 27 Dec 2025 01:01:19 GMT
- Title: Differentiable Inverse Modeling with Physics-Constrained Latent Diffusion for Heterogeneous Subsurface Parameter Fields
- Authors: Zihan Lin, QiZhi He
- Abstract summary: We present a latent diffusion-based differentiable inversion method (LD-DIM) for PDE-constrained inverse problems involving high-dimensional spatially distributed coefficients. LD-DIM couples a pretrained latent diffusion prior with an end-to-end differentiable numerical solver to reconstruct unknown heterogeneous parameter fields in a low-dimensional nonlinear manifold.
- Score: 9.42765150299276
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a latent diffusion-based differentiable inversion method (LD-DIM) for PDE-constrained inverse problems involving high-dimensional spatially distributed coefficients. LD-DIM couples a pretrained latent diffusion prior with an end-to-end differentiable numerical solver to reconstruct unknown heterogeneous parameter fields in a low-dimensional nonlinear manifold, improving numerical conditioning and enabling stable gradient-based optimization under sparse observations. The proposed framework integrates a latent diffusion model (LDM), trained in a compact latent space, with a differentiable finite-volume discretization of the forward PDE. Sensitivities are propagated through the discretization using adjoint-based gradients combined with reverse-mode automatic differentiation. Inversion is performed directly in latent space, which implicitly suppresses ill-conditioned degrees of freedom while preserving dominant structural modes, including sharp material interfaces. The effectiveness of LD-DIM is demonstrated using a representative inverse problem for flow in porous media, where heterogeneous conductivity fields are reconstructed from spatially sparse hydraulic head measurements. Numerical experiments assess convergence behavior and reconstruction quality for both Gaussian random fields and bimaterial coefficient distributions. The results show that LD-DIM achieves consistently improved numerical stability and reconstruction accuracy of both parameter fields and corresponding PDE solutions compared with physics-informed neural networks (PINNs) and physics-embedded variational autoencoder (VAE) baselines, while maintaining sharp discontinuities and reducing sensitivity to initialization.
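The gradient chain the abstract describes (adjoint sensitivities through the discretization, then reverse-mode chain rule through the decoder, with the optimization carried out in latent space) can be sketched on a toy 1D problem. Everything below is an invented stand-in, not the authors' code: a random linear map plus softplus plays the role of the pretrained LDM decoder, a tridiagonal system plays the role of the finite-volume solver, and the grid sizes and observation indices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 4          # grid nodes and latent dimension (illustrative sizes)

# Hypothetical linear "decoder" W: stands in for the pretrained latent
# diffusion prior, mapping a low-dimensional latent z to face conductivities.
W = rng.normal(size=(n - 1, m))

def assemble(k):
    """Tridiagonal 1D finite-volume diffusion matrix. k[j] is the
    conductivity of the face between nodes j and j+1; unit conductances tie
    the end nodes to Dirichlet boundaries, so A is symmetric positive
    definite."""
    A = np.zeros((n, n))
    for j in range(n - 1):
        A[j, j] += k[j]
        A[j + 1, j + 1] += k[j]
        A[j, j + 1] -= k[j]
        A[j + 1, j] -= k[j]
    A[0, 0] += 1.0
    A[-1, -1] += 1.0
    return A

def loss_and_grad(z, b, d, obs):
    """Data misfit 0.5*||h[obs] - d||^2 and its latent gradient, computed
    with one forward and one adjoint solve (A is symmetric, so the adjoint
    system reuses A)."""
    u = W @ z
    k = np.logaddexp(0.0, u) + 1e-8      # softplus + floor keeps A nonsingular
    A = assemble(k)
    h = np.linalg.solve(A, b)            # forward PDE solve
    r = h[obs] - d                       # misfit at sparse observation nodes
    rhs = np.zeros(n)
    rhs[obs] = r
    lam = np.linalg.solve(A, rhs)        # adjoint solve
    # dL/dk_j = -lam^T (dA/dk_j) h = -(lam_j - lam_{j+1}) (h_j - h_{j+1})
    gk = -(lam[:-1] - lam[1:]) * (h[:-1] - h[1:])
    gz = W.T @ (gk / (1.0 + np.exp(-u)))  # chain rule: softplus' = sigmoid
    return 0.5 * float(r @ r), gz

# Synthetic inversion: recover z from sparse, noiseless head observations.
b = np.ones(n)                            # source term
obs = np.array([3, 9, 15])                # sparse observation nodes
z_true = rng.normal(size=m)
d = np.linalg.solve(assemble(np.logaddexp(0.0, W @ z_true) + 1e-8), b)[obs]

z = np.zeros(m)                           # latent-space gradient descent
for _ in range(100):
    L, g = loss_and_grad(z, b, d, obs)
    step = 1.0                            # backtracking line search
    while loss_and_grad(z - step * g, b, d, obs)[0] >= L and step > 1e-12:
        step *= 0.5
    z -= step * g
L_final, _ = loss_and_grad(z, b, d, obs)
```

Optimizing over the four latent coordinates rather than the nineteen face conductivities is the conditioning benefit the abstract points to: the decoder restricts the search to its range, so ill-determined degrees of freedom never enter the optimization.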
Related papers
- Function-Space Decoupled Diffusion for Forward and Inverse Modeling in Carbon Capture and Storage [65.51149575007149]
We present Fun-DDPS, a generative framework that combines function-space diffusion models with differentiable neural operator surrogates for both forward and inverse modeling. Fun-DDPS produces physically consistent realizations free from the high-frequency artifacts observed in joint-state baselines.
arXiv Detail & Related papers (2026-02-12T18:58:12Z)
- Physics-informed diffusion models in spectral space [2.5315729179239637]
We propose a methodology that combines generative latent diffusion models with physics-informed machine learning. We learn the joint distribution of PDE parameters and solutions via a diffusion process in a latent space of scaled spectral representations. We evaluate the proposed approach on Poisson, Helmholtz, and incompressible Navier--Stokes equations, demonstrating improved accuracy and computational efficiency.
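For readers unfamiliar with the term, "spectral space" here means working on (scaled) Fourier coefficients, where many PDE operators become diagonal. A generic illustration of that structure, unrelated to the paper's actual model: a periodic Poisson solve reduces to a pointwise division on coefficients.

```python
import numpy as np

# Solve -u'' = f on [0, 1) with periodic boundaries in spectral space.
N = 128
x = np.linspace(0.0, 1.0, N, endpoint=False)
f = np.sin(2.0 * np.pi * x)

k = 2.0 * np.pi * np.fft.rfftfreq(N, d=1.0 / N)  # angular wavenumbers
f_hat = np.fft.rfft(f)
u_hat = np.zeros_like(f_hat)
u_hat[1:] = f_hat[1:] / k[1:] ** 2               # -u'' = f  ->  k^2 u_hat = f_hat
u = np.fft.irfft(u_hat, n=N)                     # back to physical space
```

Because differentiation is multiplication by ik in this basis, the elliptic solve is a diagonal operation; latent spaces built on spectral coefficients inherit that convenient structure.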
arXiv Detail & Related papers (2026-02-10T12:11:07Z)
- Decoupled Diffusion Sampling for Inverse Problems on Function Spaces [73.52103661482242]
Existing plug-and-play diffusion posterior samplers represent physics implicitly through joint coefficient-solution modeling. We propose a physics-aware generative framework in function space for inverse PDE problems. Our Decoupled Diffusion Inverse Solver (DDIS) employs a decoupled design: an unconditional diffusion model learns the coefficient prior, while a neural operator explicitly models the forward PDE for guidance.
arXiv Detail & Related papers (2026-01-30T18:54:49Z)
- Numerical PDE solvers outperform neural PDE solvers [5.303553599778495]
DeepFDM is a finite-difference framework for learning spatially varying coefficients in time-dependent partial differential equations. It enforces stability and first-order convergence via CFL-compliant coefficient parameterizations. It attains normalized mean-squared errors one to two orders of magnitude smaller than Fourier Neural Operators, U-Nets, and ResNets.
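The stability mechanism named in this summary can be sketched generically. The details below (sigmoid squashing, periodic boundaries, the particular grid) are illustrative assumptions, not DeepFDM's actual parameterization:

```python
import numpy as np

def cfl_step(u, theta, dx, k_max=1.0):
    """One explicit finite-difference step of u_t = k(x) u_xx on a periodic
    grid. Bounding k by k_max via a sigmoid and choosing dt = dx^2/(2*k_max)
    satisfies the CFL condition, so the update is a convex combination of
    neighboring values and obeys a discrete maximum principle."""
    k = k_max / (1.0 + np.exp(-theta))        # bounded coefficient field
    dt = dx * dx / (2.0 * k_max)              # CFL-compliant time step
    lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)
    return u + dt * k * lap / (dx * dx)

x = np.linspace(0.0, 1.0, 64, endpoint=False)
u0 = np.sin(2.0 * np.pi * x)                  # initial condition
theta = np.cos(4.0 * np.pi * x)               # arbitrary "learned" parameters
u = u0.copy()
for _ in range(200):
    u = cfl_step(u, theta, dx=x[1] - x[0])
```

Whatever values the learned parameters `theta` take, the coefficient stays in (0, k_max), so the scheme cannot blow up during training.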
arXiv Detail & Related papers (2025-07-28T18:50:37Z)
- On latent dynamics learning in nonlinear reduced order modeling [0.6249768559720122]
We present the novel mathematical framework of latent dynamics models (LDMs) for reduced order modeling of parameterized nonlinear time-dependent PDEs. A time-continuous setting is employed to derive error and stability estimates for the LDM approximation of the full order model (FOM) solution. Deep neural networks approximate the discrete LDM components, while providing a bounded approximation error with respect to the FOM.
arXiv Detail & Related papers (2024-08-27T16:35:06Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
Amortized Posterior Sampling is a novel variational inference approach for efficient posterior sampling in inverse problems. Our method trains a conditional flow model to minimize the divergence between the variational distribution and the posterior distribution implicitly defined by the diffusion model. Unlike existing methods, our approach is unsupervised, requires no paired training data, and is applicable to both Euclidean and non-Euclidean domains.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- AdjointDEIS: Efficient Gradients for Diffusion Models [2.0795007613453445]
We show that the continuous adjoint equations for diffusion SDEs simplify to a simple ODE. We also demonstrate the effectiveness of AdjointDEIS for guided generation with an adversarial attack in the form of the face morphing problem.
arXiv Detail & Related papers (2024-05-23T19:51:33Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Learning Discretized Neural Networks under Ricci Flow [48.47315844022283]
We study Discretized Neural Networks (DNNs) composed of low-precision weights and activations. DNNs suffer from either infinite or zero gradients due to the non-differentiable discrete function during training.
arXiv Detail & Related papers (2023-02-07T10:51:53Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM consistently achieves state-of-the-art results and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Stationary Density Estimation of Itô Diffusions Using Deep Learning [6.8342505943533345]
We consider the density estimation problem associated with the stationary measure of ergodic Itô diffusions from a discrete-time series.
We employ deep neural networks to approximate the drift and diffusion terms of the SDE.
We establish the convergence of the proposed scheme under appropriate mathematical assumptions.
arXiv Detail & Related papers (2021-09-09T01:57:14Z)
- Non-intrusive reduced order modeling of poroelasticity of heterogeneous media based on a discontinuous Galerkin approximation [0.0]
We present a non-intrusive model reduction framework for linear poroelasticity problems in heterogeneous porous media.
We utilize the interior penalty discontinuous Galerkin (DG) method as a full order solver to handle discontinuities. We show that our framework provides reasonable approximations of the DG solution while being significantly faster.
arXiv Detail & Related papers (2021-01-28T04:21:06Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.