Scalable diffusion posterior sampling in infinite-dimensional inverse problems
- URL: http://arxiv.org/abs/2405.15643v2
- Date: Mon, 03 Feb 2025 08:49:31 GMT
- Title: Scalable diffusion posterior sampling in infinite-dimensional inverse problems
- Authors: Fabian Schneider, Duc-Lam Duong, Matti Lassas, Maarten V. de Hoop, Tapio Helin
- Abstract summary: We propose a scalable diffusion posterior sampling (SDPS) method to bypass forward mapping evaluations during sampling.
The approach is shown to generalize to infinite-dimensional diffusion models and is validated through rigorous convergence analysis and high-dimensional CT imaging experiments.
- Score: 5.340736751238338
- Abstract: Score-based diffusion models (SDMs) have emerged as a powerful tool for sampling from the posterior distribution in Bayesian inverse problems. However, existing methods often require multiple evaluations of the forward mapping to generate a single sample, resulting in significant computational costs for large-scale inverse problems. To address this issue, we propose a scalable diffusion posterior sampling (SDPS) method to bypass forward mapping evaluations during sampling by shifting computational effort to an offline training phase, where a task-dependent score is learned based on the forward mapping. Crucially, the conditional posterior score is then derived from this trained score using affine transformations, ensuring no conditional score approximation is needed. The approach is shown to generalize to infinite-dimensional diffusion models and is validated through rigorous convergence analysis and high-dimensional CT imaging experiments.
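For context, diffusion posterior samplers generally rest on the Bayes-rule decomposition of the posterior score into a prior score plus a likelihood score; SDPS differs in deriving the conditional score from an offline-trained, task-dependent score via affine transformations rather than evaluating the likelihood term (and hence the forward mapping) online. Below is a minimal sketch of the generic decomposition for a linear-Gaussian model, with illustrative names only, not the paper's implementation:

```python
import numpy as np

def posterior_score(x_t, y, A, sigma_y, prior_score):
    """Generic posterior score for y = A x + Gaussian noise:
    grad log p(x_t | y) = grad log p(x_t) + grad log p(y | x_t).

    The likelihood term below is evaluated online with the forward
    operator A -- exactly the per-sample cost that SDPS shifts into
    offline training of a task-dependent score. Noise-level effects
    on the likelihood are ignored here for clarity.
    """
    likelihood_score = A.T @ (y - A @ x_t) / sigma_y**2
    return prior_score(x_t) + likelihood_score
```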
Related papers
- Geophysical inverse problems with measurement-guided diffusion models [0.4532517021515834]
I consider two recently proposed sampling algorithms, Diffusion Posterior Sampling (DPS) and the Pseudo-inverse Guided Diffusion Model (PGDM).
In DPS, the guidance term is formed by applying the adjoint of the modeling operator to the residual computed from a one-step denoising estimate of the solution.
PGDM, on the other hand, utilizes a pseudo-inverse operator, which arises because the one-step denoised solution is treated as a random variable rather than as deterministic.
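A hedged sketch of the DPS-style guidance just described, with placeholder names (the denoiser and step size are not from either paper); note that the full DPS gradient also flows through the denoiser's Jacobian, which this adjoint-only form omits:

```python
import numpy as np

def dps_guidance(x_t, y, A, denoise, zeta=1.0):
    """DPS-style guidance for a linear modeling operator A: apply the
    adjoint A.T to the residual of a one-step denoising estimate."""
    x0_hat = denoise(x_t)           # one-step denoised estimate of the solution
    residual = y - A @ x0_hat       # data residual
    return zeta * (A.T @ residual)  # adjoint of the modeling operator on the residual
```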
arXiv Detail & Related papers (2025-01-08T23:33:50Z)
- Enhancing Diffusion Models for Inverse Problems with Covariance-Aware Posterior Sampling [3.866047645663101]
In computer vision, for example, tasks such as inpainting, deblurring, and super-resolution can be effectively modeled as inverse problems.
DDPMs are shown to provide a promising solution to noisy linear inverse problems without the need for additional task-specific training.
arXiv Detail & Related papers (2024-12-28T06:17:44Z)
- Arbitrary-steps Image Super-resolution via Diffusion Inversion [68.78628844966019]
This study presents a new image super-resolution (SR) technique based on diffusion inversion, aiming at harnessing the rich image priors encapsulated in large pre-trained diffusion models to improve SR performance.
We design a Partial noise Prediction strategy to construct an intermediate state of the diffusion model, which serves as the starting sampling point.
Once trained, this noise predictor can be used to initialize the sampling process partially along the diffusion trajectory, generating the desirable high-resolution result.
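The intermediate starting state can be illustrated with the standard forward-diffusion reparameterization, using the trained noise predictor in place of random noise (a sketch under the usual variance-preserving schedule; all names are illustrative):

```python
import numpy as np

def intermediate_state(x_lr_up, predicted_noise, alpha_bar_t):
    """Partially-noised starting point x_t for the reverse process.

    x_lr_up         : upsampled low-resolution input (illustrative)
    predicted_noise : output of the trained noise predictor for this input
    alpha_bar_t     : cumulative noise-schedule coefficient at step t
    """
    return np.sqrt(alpha_bar_t) * x_lr_up + np.sqrt(1.0 - alpha_bar_t) * predicted_noise
```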
arXiv Detail & Related papers (2024-12-12T07:24:13Z)
- Score-Based Variational Inference for Inverse Problems [19.848238197979157]
In applications where the posterior mean is preferred, one has to generate multiple samples from the posterior, which is time-consuming.
We establish a framework termed reverse mean propagation (RMP) that targets the posterior mean directly.
We develop an algorithm that optimizes the reverse KL divergence with natural gradient descent using score functions and propagates the mean at each reverse step.
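For reference, the reverse KL divergence optimized here has the standard form (generic notation, not necessarily the paper's), minimized over the variational family q at each reverse step:

```latex
\mathrm{KL}\bigl(q \,\|\, p(\cdot \mid y)\bigr)
  = \mathbb{E}_{x \sim q}\left[ \log q(x) - \log p(x \mid y) \right]
```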
arXiv Detail & Related papers (2024-10-08T02:55:16Z)
- Diffusion Prior-Based Amortized Variational Inference for Noisy Inverse Problems [12.482127049881026]
We propose a novel approach to solve inverse problems with a diffusion prior from an amortized variational inference perspective.
Our amortized inference learns a function that directly maps measurements to the implicit posterior distributions of corresponding clean data, enabling a single-step posterior sampling even for unseen measurements.
arXiv Detail & Related papers (2024-07-23T02:14:18Z)
- Divide-and-Conquer Posterior Sampling for Denoising Diffusion Priors [21.0128625037708]
We present an innovative framework, divide-and-conquer posterior sampling.
It reduces the approximation error associated with current techniques without the need for retraining.
We demonstrate the versatility and effectiveness of our approach for a wide range of Bayesian inverse problems.
arXiv Detail & Related papers (2024-03-18T01:47:24Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Building on this, we propose to improve recent methods by using a more principled covariance determined by maximum likelihood estimation.
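For a linear-Gaussian measurement model, covariance-aware guidance of this kind whitens the residual by the full predictive covariance instead of an isotropic one (a generic sketch, not the paper's estimator; the Jacobian of the denoised mean is omitted and all names are illustrative):

```python
import numpy as np

def covariance_aware_score(x0_hat, y, A, cov_x0, sigma_y):
    """Likelihood score under p(x_0 | x_t) ~ N(x0_hat, cov_x0) and
    y = A x_0 + N(0, sigma_y^2 I): marginalizing x_0 gives
    y | x_t ~ N(A x0_hat, A cov_x0 A.T + sigma_y^2 I)."""
    S = A @ cov_x0 @ A.T + sigma_y**2 * np.eye(A.shape[0])
    return A.T @ np.linalg.solve(S, y - A @ x0_hat)
```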
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via an approximation of posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
- Improving Diffusion Models for Inverse Problems using Manifold Constraints [55.91148172752894]
We show that current solvers throw the sample path off the data manifold, and hence the error accumulates.
To address this, we propose an additional correction term inspired by the manifold constraint.
We show that our method is superior to the previous methods both theoretically and empirically.
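A sketch of a correction term in the spirit of the manifold constraint (hypothetical names; PyTorch autograd stands in for the paper's implementation): the data-fidelity gradient is taken through the denoiser, so the update respects how x_t moves the denoised estimate along the learned manifold, rather than applying the adjoint alone.

```python
import torch

def manifold_correction(x_t, y, A, denoiser, weight=1.0):
    """Manifold-inspired correction: differentiate the data-fidelity loss
    through the one-step denoiser instead of stopping at the residual."""
    x_t = x_t.detach().requires_grad_(True)
    x0_hat = denoiser(x_t)                   # one-step denoised estimate
    loss = ((y - A @ x0_hat) ** 2).sum()     # fidelity measured on the denoised estimate
    grad, = torch.autograd.grad(loss, x_t)   # gradient flows through the denoiser
    return -weight * grad
```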
arXiv Detail & Related papers (2022-06-02T09:06:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.