EquiReg: Equivariance Regularized Diffusion for Inverse Problems
- URL: http://arxiv.org/abs/2505.22973v1
- Date: Thu, 29 May 2025 01:25:43 GMT
- Title: EquiReg: Equivariance Regularized Diffusion for Inverse Problems
- Authors: Bahareh Tolooshams, Aditi Chandrashekar, Rayhan Zirvi, Abbas Mammadov, Jiachen Yao, Chuwei Wang, Anima Anandkumar
- Abstract summary: We propose EquiReg diffusion, a framework for regularizing posterior sampling in diffusion-based inverse problem solvers. When applied to a variety of solvers, EquiReg outperforms state-of-the-art diffusion models in both linear and nonlinear image restoration tasks.
- Score: 67.01847869495558
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models represent the state-of-the-art for solving inverse problems such as image restoration tasks. In the Bayesian framework, diffusion-based inverse solvers incorporate a likelihood term to guide the prior sampling process, generating data consistent with the posterior distribution. However, due to the intractability of the likelihood term, many current methods rely on isotropic Gaussian approximations, which lead to deviations from the data manifold and result in inconsistent, unstable reconstructions. We propose Equivariance Regularized (EquiReg) diffusion, a general framework for regularizing posterior sampling in diffusion-based inverse problem solvers. EquiReg enhances reconstructions by reweighting diffusion trajectories and penalizing those that deviate from the data manifold. We define a new distribution-dependent equivariance error, empirically identify functions that exhibit low error for on-manifold samples and higher error for off-manifold samples, and leverage these functions to regularize the diffusion sampling process. When applied to a variety of solvers, EquiReg outperforms state-of-the-art diffusion models in both linear and nonlinear image restoration tasks, as well as in reconstructing partial differential equations.
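As a concrete illustration of the idea in the abstract, the equivariance error and its use as a trajectory penalty could be sketched as follows. This is a minimal NumPy sketch under our own assumptions: the function names, the finite-difference gradient, and the additive penalty form are illustrative choices, not the paper's actual method or API.

```python
import numpy as np

def equivariance_error(f, g, x):
    """Equivariance error of f under the transform g at x: how far f is from
    commuting with g, i.e. ||f(g(x)) - g(f(x))||. Low for on-manifold samples,
    higher for off-manifold samples (per the paper's premise)."""
    return np.linalg.norm(f(g(x)) - g(f(x)))

def regularized_guidance(x, data_grad, f, g, lam=0.1, eps=1e-4):
    """One illustrative guidance step: combine the usual data-consistency
    gradient with a finite-difference gradient of the equivariance penalty,
    so trajectories with large equivariance error are pushed back toward
    the manifold. Forward differences are used purely for simplicity."""
    e0 = equivariance_error(f, g, x)
    grad_eq = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp.flat[i] += eps  # perturb one coordinate at a time
        grad_eq.flat[i] = (equivariance_error(f, g, xp) - e0) / eps
    return data_grad + lam * grad_eq
```

For an elementwise f (e.g. squaring) and g a coordinate flip, the error vanishes; for a non-commuting f such as a cumulative sum it is strictly positive, which mirrors the on-/off-manifold discrimination the regularizer exploits.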
Related papers
- A Mixture-Based Framework for Guiding Diffusion Models [19.83064246586143]
Denoising diffusion models have driven significant progress in the field of Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems. This work proposes a novel mixture approximation of the intermediate distributions arising in these approaches.
arXiv Detail & Related papers (2025-02-05T16:26:06Z)
- Geophysical inverse problems with measurement-guided diffusion models [0.4532517021515834]
I consider two sampling algorithms recently proposed under the names Diffusion Posterior Sampling (DPS) and Pseudo-inverse Guided Diffusion Model (PGDM). In DPS, the guidance term is obtained by applying the adjoint of the modeling operator to the residual of a one-step denoised estimate of the solution. PGDM, on the other hand, utilizes a pseudo-inverse operator, which arises because the one-step denoised solution is not assumed to be deterministic.
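The DPS guidance just described can be sketched for a linear modeling operator A as follows. This is a minimal sketch under assumed names: `denoise` stands in for a one-step denoised estimate from a pre-trained model, and the step size and update form are illustrative rather than the authors' exact rule.

```python
import numpy as np

def dps_guidance_step(x_t, y, A, denoise, step=1.0):
    """DPS-style guidance sketch for a linear operator A: form a one-step
    denoised estimate, compute the measurement residual, and map it back
    through the adjoint A^T to correct the current state."""
    x0_hat = denoise(x_t)                 # one-step denoised estimate of the solution
    residual = A @ x0_hat - y             # misfit in measurement space
    return x_t - step * (A.T @ residual)  # adjoint of A carries the correction back
```

With A the identity and a denoiser that already matches the measurements, the residual vanishes and the state is left unchanged, which is the expected fixed-point behavior of such a guidance step.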
arXiv Detail & Related papers (2025-01-08T23:33:50Z)
- G2D2: Gradient-guided Discrete Diffusion for image inverse problem solving [55.185588994883226]
This paper presents a novel method for addressing linear inverse problems by leveraging image-generation models based on discrete diffusion as priors.
To the best of our knowledge, this is the first approach to use discrete diffusion model-based priors for solving image inverse problems.
arXiv Detail & Related papers (2024-10-09T06:18:25Z)
- Gaussian is All You Need: A Unified Framework for Solving Inverse Problems via Diffusion Posterior Sampling [16.683393726483978]
Diffusion models can generate a variety of high-quality images by modeling complex data distributions. Most of the existing diffusion-based methods integrate data consistency steps by approximating the likelihood function within the diffusion reverse sampling process. In this paper, we show that the existing approximations are either insufficient or computationally inefficient.
arXiv Detail & Related papers (2024-09-13T15:20:03Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- On Error Propagation of Diffusion Models [77.91480554418048]
We develop a theoretical framework to mathematically formulate error propagation in the architecture of DMs.
We apply the cumulative error as a regularization term to reduce error propagation.
Our proposed regularization reduces error propagation, significantly improves vanilla DMs, and outperforms previous baselines.
arXiv Detail & Related papers (2023-08-09T15:31:17Z)
- A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z)
- Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
- Reliable amortized variational inference with physics-based latent distribution correction [0.4588028371034407]
A neural network is trained to approximate the posterior distribution over existing pairs of model and data.
The accuracy of this approach relies on the availability of high-fidelity training data.
We show that our correction step improves the robustness of amortized variational inference with respect to changes in number of source experiments, noise variance, and shifts in the prior distribution.
arXiv Detail & Related papers (2022-07-24T02:38:54Z)
- Improving Diffusion Models for Inverse Problems using Manifold Constraints [55.91148172752894]
We show that current solvers throw the sample path off the data manifold, and hence the error accumulates.
To address this, we propose an additional correction term inspired by the manifold constraint.
We show that our method is superior to the previous methods both theoretically and empirically.
arXiv Detail & Related papers (2022-06-02T09:06:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.