Solving Linear Inverse Problems Provably via Posterior Sampling with Latent Diffusion Models
- URL: http://arxiv.org/abs/2307.00619v1
- Date: Sun, 2 Jul 2023 17:21:30 GMT
- Title: Solving Linear Inverse Problems Provably via Posterior Sampling with Latent Diffusion Models
- Authors: Litu Rout and Negin Raoof and Giannis Daras and Constantine Caramanis and Alexandros G. Dimakis and Sanjay Shakkottai
- Abstract summary: We present the first framework to solve linear inverse problems leveraging pre-trained latent diffusion models.
We theoretically analyze our algorithm showing provable sample recovery in a linear model setting.
- Score: 98.95988351420334
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present the first framework to solve linear inverse problems leveraging
pre-trained latent diffusion models. Previously proposed algorithms (such as
DPS and DDRM) only apply to pixel-space diffusion models. We theoretically
analyze our algorithm showing provable sample recovery in a linear model
setting. The algorithmic insight obtained from our analysis extends to more
general settings often considered in practice. Experimentally, we outperform
previously proposed posterior sampling algorithms in a wide variety of problems
including random inpainting, block inpainting, denoising, deblurring,
destriping, and super-resolution.
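All of the tasks listed above can be written as linear inverse problems y = Ax + noise. As a minimal sketch (an assumed illustration, not code from the paper), random inpainting corresponds to a measurement operator A that keeps a random subset of pixels:

```python
import numpy as np

# Minimal sketch (assumed setup): random inpainting as a linear
# measurement y = A x, where A selects a random subset of pixels.
rng = np.random.default_rng(0)
img = rng.random((16, 16))          # stand-in image
x = img.ravel()

keep = rng.random(x.size) < 0.3     # observe ~30% of the pixels
A = np.eye(x.size)[keep]            # selection matrix: rows of the identity
y = A @ x                           # observed measurements

# Applying the mask directly gives the same measurements
# without materializing A explicitly.
assert np.allclose(y, x[keep])
```

Block inpainting, deblurring, and super-resolution fit the same template with A replaced by a block mask, a convolution matrix, or a downsampling matrix, respectively.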
Related papers
- Diffusion Prior-Based Amortized Variational Inference for Noisy Inverse Problems [12.482127049881026]
We propose a novel approach to solve inverse problems with a diffusion prior from an amortized variational inference perspective.
Our amortized inference learns a function that directly maps measurements to the implicit posterior distributions of corresponding clean data, enabling a single-step posterior sampling even for unseen measurements.
arXiv Detail & Related papers (2024-07-23T02:14:18Z)
- Principled Probabilistic Imaging using Diffusion Models as Plug-and-Play Priors [29.203951468436145]
Diffusion models (DMs) have recently shown outstanding capabilities in modeling complex image distributions.
We propose a Markov chain Monte Carlo algorithm that performs posterior sampling for general inverse problems.
We demonstrate the effectiveness of the proposed method on six inverse problems.
arXiv Detail & Related papers (2024-05-29T05:42:25Z)
- Divide-and-Conquer Posterior Sampling for Denoising Diffusion Priors [21.51814794909746]
In this work, we take a different approach to define a set of intermediate and simpler posterior sampling problems, resulting in a lower approximation error compared to previous methods.
We empirically demonstrate the reconstruction capability of our method for general linear inverse problems using synthetic examples and various image restoration tasks.
arXiv Detail & Related papers (2024-03-18T01:47:24Z)
- Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency [7.671153315762146]
Training diffusion models in the pixel space is both data-intensive and computationally demanding.
Latent diffusion models, which operate in a much lower-dimensional space, offer a solution to these challenges.
We propose ReSample, an algorithm that can solve general inverse problems with pre-trained latent diffusion models.
arXiv Detail & Related papers (2023-07-16T18:42:01Z)
- A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Improving Diffusion Models for Inverse Problems using Manifold Constraints [55.91148172752894]
We show that current solvers throw the sample path off the data manifold, and hence the error accumulates.
To address this, we propose an additional correction term inspired by the manifold constraint.
We show that our method is superior to the previous methods both theoretically and empirically.
arXiv Detail & Related papers (2022-06-02T09:06:10Z)
- Denoising Diffusion Restoration Models [110.1244240726802]
Denoising Diffusion Restoration Models (DDRM) is an efficient, unsupervised posterior sampling method.
We demonstrate DDRM's versatility on several image datasets for super-resolution, deblurring, inpainting, and colorization.
arXiv Detail & Related papers (2022-01-27T20:19:07Z)
- Solving Inverse Problems with a Flow-based Noise Model [100.18560761392692]
We study image inverse problems with a normalizing flow prior.
Our formulation views the solution as the maximum a posteriori estimate of the image conditioned on the measurements.
We empirically validate the efficacy of our method on various inverse problems, including compressed sensing with quantized measurements and denoising with highly structured noise patterns.
arXiv Detail & Related papers (2020-03-18T08:33:49Z)
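The flow-based entry above frames recovery as a maximum a posteriori (MAP) estimate. As a toy illustration (an assumption for exposition, with a standard Gaussian prior standing in for the learned flow prior, and all sizes chosen arbitrarily), a linear Gaussian measurement model admits a closed-form MAP solution:

```python
import numpy as np

# Toy MAP estimate under y = A x + noise with Gaussian noise and a
# standard-normal prior (a stand-in for a learned flow prior).
rng = np.random.default_rng(0)
n, m, sigma = 8, 4, 0.1
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
y = A @ x_true + sigma * rng.standard_normal(m)

# x_map = argmin ||y - A x||^2 / sigma^2 + ||x||^2, which reduces to
# solving (A^T A / sigma^2 + I) x = A^T y / sigma^2.
x_map = np.linalg.solve(A.T @ A / sigma**2 + np.eye(n), A.T @ y / sigma**2)
residual = np.linalg.norm(A @ x_map - y)
```

With a nonlinear prior such as a normalizing flow, no closed form exists and the same objective is instead optimized by gradient methods; the quadratic case above only shows the structure of the objective.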
This list is automatically generated from the titles and abstracts of the papers in this site.