On Maximum-a-Posteriori estimation with Plug & Play priors and
stochastic gradient descent
- URL: http://arxiv.org/abs/2201.06133v1
- Date: Sun, 16 Jan 2022 20:50:08 GMT
- Title: On Maximum-a-Posteriori estimation with Plug & Play priors and
stochastic gradient descent
- Authors: Rémi Laumont and Valentin de Bortoli and Andrés Almansa and Julie
Delon and Alain Durmus and Marcelo Pereyra
- Abstract summary: Methods to solve imaging problems usually combine an explicit data likelihood function with a prior that explicitly models expected properties of the solution.
In a departure from explicit modelling, several recent works have proposed and studied the use of implicit priors defined by an image denoising algorithm.
- Score: 13.168923974530307
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian methods to solve imaging inverse problems usually combine an
explicit data likelihood function with a prior distribution that explicitly
models expected properties of the solution. Many kinds of priors have been
explored in the literature, from simple ones expressing local properties to
more involved ones exploiting image redundancy at a non-local scale. In a
departure from explicit modelling, several recent works have proposed and
studied the use of implicit priors defined by an image denoising algorithm.
This approach, commonly known as Plug & Play (PnP) regularisation, can deliver
remarkably accurate results, particularly when combined with state-of-the-art
denoisers based on convolutional neural networks. However, the theoretical
analysis of PnP Bayesian models and algorithms is difficult and works on the
topic often rely on unrealistic assumptions on the properties of the image
denoiser. This paper studies maximum-a-posteriori (MAP) estimation for
Bayesian models with PnP priors. We first consider questions related to
existence, stability and well-posedness, and then present a convergence proof
for MAP computation by PnP stochastic gradient descent (PnP-SGD) under
realistic assumptions on the denoiser used. We report a range of imaging
experiments demonstrating PnP-SGD as well as comparisons with other PnP
schemes.
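To make the central algorithm concrete, here is a minimal sketch of a PnP-SGD iteration under the assumptions suggested by the abstract: a differentiable data-fidelity term and a denoiser-based approximation of the prior gradient via Tweedie's identity, grad log p(x) ≈ (D_eps(x) - x) / eps. All names, step-size rules, and parameters are illustrative, not the paper's reference implementation.

```python
import numpy as np

def pnp_sgd(y, A, At, denoiser, x0, sigma2=1e-2, eps=5e-3,
            step0=1e-2, n_iter=300, seed=0):
    """Sketch of PnP-SGD for MAP estimation.

    Minimizes f(x) + g(x), where f(x) = ||A x - y||^2 / (2 sigma2) is the
    data-fidelity term and the prior gradient is approximated from a
    denoiser D_eps via Tweedie's identity:
        grad log p(x) ~ (D_eps(x) - x) / eps.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for k in range(n_iter):
        step = step0 / (1 + k) ** 0.6                 # decreasing step sizes
        grad_f = At(A(x) - y) / sigma2                # gradient of the likelihood term
        grad_g = -(denoiser(x, eps) - x) / eps        # minus the denoiser-based prior gradient
        noise = 0.01 * rng.standard_normal(x.shape)   # stochastic perturbation of the gradient
        x = x - step * (grad_f + grad_g + noise)
    return x
```

For denoising, A and At would be the identity; for deblurring, the blur operator and its adjoint, with any sufficiently regular (e.g. Lipschitz-continuous) denoiser plugged in as `denoiser`.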
Related papers
- Provably Robust Score-Based Diffusion Posterior Sampling for Plug-and-Play Image Reconstruction [31.503662384666274]
In science and engineering, the goal is often to infer an unknown image from a small number of measurements collected from a known forward model describing a certain imaging modality.
Motivated by their empirical success, score-based diffusion models have emerged as an impressive candidate for an expressive prior in image reconstruction.
arXiv Detail & Related papers (2024-03-25T15:58:26Z)
- Exploiting Diffusion Prior for Generalizable Dense Prediction [85.4563592053464]
Contents generated by recent advanced Text-to-Image (T2I) diffusion models are sometimes too imaginative for existing off-the-shelf dense predictors to estimate.
We introduce DMP, a pipeline utilizing pre-trained T2I models as a prior for dense prediction tasks.
Despite limited-domain training data, the approach yields faithful estimations for arbitrary images, surpassing existing state-of-the-art algorithms.
arXiv Detail & Related papers (2023-11-30T18:59:44Z)
- Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
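As a toy illustration of this observation (not the paper's algorithm): for a layered chain, the derivative is the sum over paths of products of edge derivatives, and swapping the (sum, product) semiring for (max, product) yields the single most influential path instead. The example values below are hypothetical.

```python
from typing import Callable, List

def path_statistic(layers: List[List[float]],
                   plus: Callable[[float, float], float],
                   times: Callable[[float, float], float]) -> float:
    """Combine per-layer branch derivatives of a layered chain.

    With plus=+ and times=* this is the chain rule: the total derivative
    equals the sum over paths of the product of edge derivatives. Other
    semirings yield other path statistics over the same graph.
    """
    acc = None
    for branches in layers:
        combined = branches[0]
        for b in branches[1:]:
            combined = plus(combined, b)           # aggregate parallel branches
        acc = combined if acc is None else times(acc, combined)  # compose along depth
    return acc

# Hypothetical edge derivatives of a 3-layer chain with parallel branches.
layers = [[0.5, 2.0], [3.0], [0.1, 0.4]]
print(path_statistic(layers, lambda a, b: a + b, lambda a, b: a * b))  # gradient: 3.75
# (max, product) is valid here because all values are nonnegative:
print(path_statistic(layers, max, lambda a, b: a * b))  # strongest path: 2.4
```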
arXiv Detail & Related papers (2023-07-06T15:19:53Z)
- Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss of the real data and the artificial noise.
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
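A minimal sketch of the logistic-loss formulation mentioned above, assuming equal numbers of data and noise samples and a model returning an unnormalized log-density (with any learnable log-partition term folded in); the names are illustrative.

```python
import numpy as np

def nce_loss(log_model, log_noise, x_data, x_noise):
    """Logistic-loss form of noise-contrastive estimation.

    Classify data vs. noise with logits log p_model(x) - log p_noise(x);
    -log(sigmoid(z)) = logaddexp(0, -z) gives a numerically stable loss.
    """
    z_data = log_model(x_data) - log_noise(x_data)    # logits on real samples
    z_noise = log_model(x_noise) - log_noise(x_noise) # logits on noise samples
    return np.logaddexp(0.0, -z_data).mean() + np.logaddexp(0.0, z_noise).mean()
```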
arXiv Detail & Related papers (2023-06-13T01:18:16Z)
- Poisson-Gaussian Holographic Phase Retrieval with Score-based Image Prior [19.231581775644617]
We propose a new algorithm called "AWFS" that uses accelerated Wirtinger flow (AWF) with a score function as a generative prior.
We calculate the gradient of the log-likelihood function for phase retrieval (PR) and determine its Lipschitz constant.
We provide theoretical analysis that establishes a critical-point convergence guarantee for the proposed algorithm.
arXiv Detail & Related papers (2023-05-12T18:08:47Z)
- Plug-and-Play split Gibbs sampler: embedding deep generative priors in Bayesian inference [12.91637880428221]
This paper introduces a plug-and-play sampling algorithm that leverages variable splitting to efficiently sample from a posterior distribution.
It divides the challenging task of posterior sampling into two simpler sampling problems.
Its performance is compared to recent state-of-the-art optimization and sampling methods.
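Schematically, the splitting introduces an auxiliary variable z coupled to x through a quadratic term, so the sampler can alternate between a likelihood-driven conditional for x and a prior-driven (denoising-like) conditional for z. A sketch under these assumptions, with the two conditional samplers left as user-supplied callables and all names hypothetical:

```python
import numpy as np

def split_gibbs(y, sample_x_given_z, sample_z_given_x, x0, rho=0.1,
                n_iter=1000, seed=0):
    """Schematic Plug-and-Play split Gibbs loop.

    The coupling ||x - z||^2 / (2 rho) makes each conditional simpler:
      x | z, y  involves only the likelihood plus the coupling,
      z | x     involves only the prior plus the coupling (a denoising step).
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    samples = []
    for _ in range(n_iter):
        z = sample_z_given_x(x, rho, rng)     # prior-driven step on z
        x = sample_x_given_z(y, z, rho, rng)  # likelihood-driven step on x
        samples.append(x.copy())
    return samples
```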
arXiv Detail & Related papers (2023-04-21T17:17:51Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point process inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modelling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Recovery Analysis for Plug-and-Play Priors using the Restricted Eigenvalue Condition [48.08511796234349]
We show how to establish theoretical recovery guarantees for the plug-and-play priors (PnP) and regularization by denoising (RED) methods.
Our results suggest that models with a pre-trained artifact removal network provide significantly better results compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-07T14:45:38Z)
- Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie [13.476505672245603]
This paper develops theory, methods, and provably convergent algorithms for performing Bayesian inference with Plug & Play priors.
We introduce two algorithms: 1) PnP-ULA (Plug & Play Unadjusted Langevin Algorithm) for Monte Carlo sampling and MMSE inference; and 2) PnP-SGD (Plug & Play Stochastic Gradient Descent) for MAP inference.
The algorithms are demonstrated on several canonical problems such as image deblurring, inpainting, and denoising, where they are used for point estimation as well as for uncertainty visualisation and quantification.
arXiv Detail & Related papers (2021-03-08T12:46:53Z)
- Scalable Plug-and-Play ADMM with Convergence Guarantees [24.957046830965822]
We propose an incremental variant of the widely used ADMM algorithm, making it scalable to large-scale datasets.
We theoretically analyze the convergence of the algorithm under a set of explicit assumptions.
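A minimal sketch of the PnP-ADMM iteration that an incremental variant would build on, with the incremental aspect approximated by visiting one random mini-batch of measurements per iteration; the batching heuristic and all names are assumptions, not the paper's implementation.

```python
import numpy as np

def incremental_pnp_admm(prox_batch, denoiser, x0, n_batches,
                         rho=1.0, n_iter=100, seed=0):
    """Sketch of an incremental PnP-ADMM loop.

    Standard PnP-ADMM alternates:
      x <- prox of the data term at (z - u)
      z <- denoiser applied to (x + u)   (plays the role of the prior prox)
      u <- u + x - z                     (dual update)
    Here the data prox touches only one mini-batch per iteration.
    """
    rng = np.random.default_rng(seed)
    x, z, u = x0.copy(), x0.copy(), np.zeros_like(x0)
    for _ in range(n_iter):
        b = rng.integers(n_batches)      # pick one measurement batch
        x = prox_batch(z - u, rho, b)    # data-fidelity prox on batch b
        z = denoiser(x + u)              # denoiser as the prior prox
        u = u + x - z                    # dual ascent step
    return z
```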
arXiv Detail & Related papers (2020-06-05T04:10:15Z)
- The Power of Triply Complementary Priors for Image Compressive Sensing [89.14144796591685]
We propose a joint low-rank and deep (LRD) image model, which contains a pair of triply complementary priors.
We then propose a novel hybrid plug-and-play (H-PnP) framework based on the LRD model for image CS.
To make the optimization tractable, a simple yet effective algorithm is proposed to solve the proposed H-PnP based image CS problem.
arXiv Detail & Related papers (2020-05-16T08:17:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.