Denoising Diffusion Restoration Tackles Forward and Inverse Problems for
the Laplace Operator
- URL: http://arxiv.org/abs/2402.08563v2
- Date: Wed, 14 Feb 2024 21:30:39 GMT
- Title: Denoising Diffusion Restoration Tackles Forward and Inverse Problems for
the Laplace Operator
- Authors: Amartya Mukherjee, Melissa M. Stadt, Lena Podina, Mohammad Kohandel,
Jun Liu
- Abstract summary: This paper presents a novel approach for the inverse and forward solution of PDEs through the use of denoising diffusion restoration models (DDRM).
DDRMs were used in linear inverse problems to restore original clean signals by exploiting the singular value decomposition (SVD) of the linear operator.
Our results show that using denoising diffusion restoration significantly improves the estimation of the solution and parameters.
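The SVD mechanism the summary refers to can be sketched outside the diffusion framework. The following is a minimal, hypothetical NumPy illustration (not code from the paper): a synthetic ill-conditioned operator is inverted in its singular basis, and a fixed Tikhonov-style spectral filter stands in for the learned diffusion prior that DDRM would apply in that same basis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Hypothetical ill-conditioned linear operator built from a known SVD,
# H = U diag(s) V^T, with singular values decaying from 1 down to 1e-3.
s = np.logspace(0, -3, n)
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
H = U @ np.diag(s) @ V.T

x = rng.standard_normal(n)                 # clean signal
y = H @ x + 0.05 * rng.standard_normal(n)  # noisy measurement

# Naive inversion divides by every singular value, so directions with
# small s_i amplify the measurement noise by up to 1/min(s) = 1000x.
x_naive = V @ ((U.T @ y) / s)

# Tikhonov-style spectral filtering damps those directions instead.
# DDRM works in this singular basis but replaces the fixed filter
# with a learned diffusion prior.
lam = 1e-2
x_filt = V @ ((s / (s**2 + lam)) * (U.T @ y))

err_naive = np.linalg.norm(x_naive - x) / np.linalg.norm(x)
err_filt = np.linalg.norm(x_filt - x) / np.linalg.norm(x)
print(err_filt < err_naive)
```

The fixed filter here is only a stand-in; the point of DDRM is that a generative prior chooses the per-direction damping far better than a single scalar `lam` can.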
- Score: 3.8426297727671352
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models have emerged as a promising class of generative models that
map noisy inputs to realistic images. More recently, they have been employed to
generate solutions to partial differential equations (PDEs). However, they
still struggle with inverse problems involving the Laplacian operator, for instance,
the Poisson equation, because the eigenvalues that are large in magnitude
amplify the measurement noise. This paper presents a novel approach for the
inverse and forward solution of PDEs through the use of denoising diffusion
restoration models (DDRM). DDRMs were used in linear inverse problems to
restore original clean signals by exploiting the singular value decomposition
(SVD) of the linear operator. Equivalently, we present an approach to restore
the solution and the parameters in the Poisson equation by exploiting the
eigenvalues and the eigenfunctions of the Laplacian operator. Our results show
that using denoising diffusion restoration significantly improves the
estimation of the solution and parameters. Our research, as a result, pioneers
the integration of diffusion models with the principles of underlying physics
to solve PDEs.
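The eigenvalue-amplification issue described in the abstract can be reproduced in a few lines. The sketch below is an illustrative stand-in, not the paper's method: it discretizes the 1D Dirichlet Laplacian, solves the forward Poisson problem in its eigenbasis, then shows that naively applying the operator to a noisy solution amplifies noise in the large-magnitude eigenmodes, while a crude spectral cutoff (in place of the learned diffusion prior) suppresses it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
h = 1.0 / (n + 1)

# Discrete 1D Laplacian with Dirichlet boundary conditions.
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2

# L is symmetric; its eigenvalues are approximately -(k*pi)^2.
evals, evecs = np.linalg.eigh(L)

x = np.linspace(h, 1 - h, n)
f = np.sin(2 * np.pi * x)            # smooth source term

# Forward problem: solve L u = f in the eigenbasis.
u = evecs @ ((evecs.T @ f) / evals)

# Inverse problem: recover f from a noisy observation of u.
u_noisy = u + 1e-4 * rng.standard_normal(n)

# Applying L directly multiplies the noise in mode k by |lambda_k|,
# which reaches roughly 4*(n+1)^2 for the highest modes -- the
# amplification the abstract refers to.
f_naive = L @ u_noisy

# Spectral filtering (a crude stand-in for the learned diffusion
# prior): discard modes whose eigenvalue magnitude exceeds a cutoff.
cutoff = (20 * np.pi) ** 2
coef = evals * (evecs.T @ u_noisy)
coef[np.abs(evals) > cutoff] = 0.0
f_filt = evecs @ coef

err_naive = np.linalg.norm(f_naive - f) / np.linalg.norm(f)
err_filt = np.linalg.norm(f_filt - f) / np.linalg.norm(f)
print(err_filt < err_naive)
```

Because the source term lives entirely in a low-frequency eigenmode, the cutoff removes almost pure noise; a learned prior plays the same role without needing the cutoff to be hand-picked.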
Related papers
- Stability and Generalizability in SDE Diffusion Models with Measure-Preserving Dynamics [11.919291977879801]
Inverse problems describe the process of estimating the causal factors from a set of measurements or data.
Diffusion models have shown promise as potent generative tools for solving inverse problems.
arXiv Detail & Related papers (2024-06-19T15:55:12Z)
- ODE-DPS: ODE-based Diffusion Posterior Sampling for Inverse Problems in Partial Differential Equation [1.8356973269166506]
We introduce a novel unsupervised inversion methodology tailored for solving inverse problems arising from PDEs.
Our approach operates within the Bayesian inversion framework, treating the task of solving the posterior distribution as a conditional generation process.
To enhance the accuracy of inversion results, we propose an ODE-based Diffusion inversion algorithm.
arXiv Detail & Related papers (2024-04-21T00:57:13Z)
- Divide-and-Conquer Posterior Sampling for Denoising Diffusion Priors [21.51814794909746]
In this work, we take a different approach to define a set of intermediate and simpler posterior sampling problems, resulting in a lower approximation error compared to previous methods.
We empirically demonstrate the reconstruction capability of our method for general linear inverse problems using synthetic examples and various image restoration tasks.
arXiv Detail & Related papers (2024-03-18T01:47:24Z)
- A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- GibbsDDRM: A Partially Collapsed Gibbs Sampler for Solving Blind Inverse Problems with Denoising Diffusion Restoration [64.8770356696056]
We propose GibbsDDRM, an extension of Denoising Diffusion Restoration Models (DDRM) to a blind setting in which the linear measurement operator is unknown.
The proposed method is problem-agnostic, meaning that a pre-trained diffusion model can be applied to various inverse problems without fine-tuning.
arXiv Detail & Related papers (2023-01-30T06:27:48Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- JPEG Artifact Correction using Denoising Diffusion Restoration Models [110.1244240726802]
We build upon Denoising Diffusion Restoration Models (DDRM) and propose a method for solving some non-linear inverse problems.
We leverage the pseudo-inverse operator used in DDRM and generalize this concept for other measurement operators.
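The pseudo-inverse idea in this summary can be illustrated generically. The snippet below uses a hypothetical random wide measurement operator rather than a JPEG operator, and shows the minimum-norm reconstruction that DDRM-style methods would then refine with a diffusion prior.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical underdetermined measurement: m = 30 observations of an
# n = 60 dimensional signal (A is wide, so it has no exact inverse).
m, n = 30, 60
A = rng.standard_normal((m, n)) / np.sqrt(n)
x = rng.standard_normal(n)
y = A @ x

# The Moore-Penrose pseudo-inverse maps measurements back to signal
# space: A_pinv @ y is the minimum-norm vector consistent with y, the
# starting point a diffusion prior would refine.
A_pinv = np.linalg.pinv(A)
x0 = A_pinv @ y

# Data consistency: re-measuring the pseudo-inverse reconstruction
# reproduces y exactly (up to floating point), since A has full row rank.
print(np.allclose(A @ x0, y))  # True
```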
arXiv Detail & Related papers (2022-09-23T23:47:00Z)
- Differentiable Gaussianization Layers for Inverse Problems Regularized by Deep Generative Models [0.0]
We show that latent tensors of deep generative models can fall out of the desired high-dimensional standard Gaussian distribution during inversion.
Our approach achieves state-of-the-art performance in terms of accuracy and consistency.
arXiv Detail & Related papers (2021-12-07T17:53:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.