Fast Scalable Image Restoration using Total Variation Priors and
Expectation Propagation
- URL: http://arxiv.org/abs/2110.01585v1
- Date: Mon, 4 Oct 2021 17:28:41 GMT
- Title: Fast Scalable Image Restoration using Total Variation Priors and
Expectation Propagation
- Authors: Dan Yao, Stephen McLaughlin, Yoann Altmann
- Abstract summary: This paper presents a scalable approximate Bayesian method for image restoration using total variation (TV) priors.
We use the expectation propagation (EP) framework to approximate minimum mean squared error (MMSE) estimators and marginal (pixel-wise) variances.
- Score: 7.7731951589289565
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper presents a scalable approximate Bayesian method for image
restoration using total variation (TV) priors. In contrast to most optimization
methods based on maximum a posteriori estimation, we use the expectation
propagation (EP) framework to approximate minimum mean squared error (MMSE)
estimators and marginal (pixel-wise) variances, without resorting to Monte
Carlo sampling. For the classical anisotropic TV-based prior, we also propose
an iterative scheme to automatically adjust the regularization parameter via
expectation-maximization (EM). Using Gaussian approximating densities with
diagonal covariance matrices, the resulting method allows highly parallelizable
steps and can scale to large images for denoising, deconvolution and
compressive sensing (CS) problems. The simulation results illustrate that such
EP methods can provide a posteriori estimates on par with those obtained via
sampling methods but at a fraction of the computational cost. Moreover, EP does
not exhibit strong underestimation of the posterior variances, in contrast to
variational Bayes alternatives.
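As a concrete illustration of the two main ingredients, the following is a minimal Python sketch, not the authors' code: it evaluates the anisotropic TV functional and performs an EM-style M-step for the regularization parameter lambda under a diagonal-covariance Gaussian approximation q with pixel-wise means m and variances v, as EP produces. The lambda update treats the pixel differences as independent Laplace variables, which is a common simplification; the paper's exact update may differ, and all function names are our own.

    import numpy as np
    from scipy.special import erf  # assumes SciPy is available

    def anisotropic_tv(x):
        """Anisotropic total variation: sum of absolute horizontal and
        vertical first-order pixel differences of the image x."""
        return np.abs(np.diff(x, axis=1)).sum() + np.abs(np.diff(x, axis=0)).sum()

    def expected_abs(mu, s2):
        """E|d| for d ~ N(mu, s2): the folded-normal mean."""
        s = np.sqrt(s2)
        return (mu * erf(mu / (s * np.sqrt(2.0)))
                + s * np.sqrt(2.0 / np.pi) * np.exp(-(mu ** 2) / (2.0 * s2)))

    def em_update_lambda(m, v):
        """One EM-style M-step for lambda. Under a diagonal-covariance
        Gaussian q, each difference x_i - x_j is Gaussian with mean
        m_i - m_j and variance v_i + v_j; maximizing
        K*log(lam) - lam*E[TV(x)] gives lam = K / E[TV(x)]."""
        e_tv, n_diffs = 0.0, 0
        for axis in (0, 1):
            dm = np.diff(m, axis=axis)          # means of the differences
            dv = (v[1:, :] + v[:-1, :]) if axis == 0 else (v[:, 1:] + v[:, :-1])
            e_tv += expected_abs(dm, dv).sum()
            n_diffs += dm.size
        return n_diffs / e_tv

    # Toy usage with a crude stand-in for the approximate posterior q.
    rng = np.random.default_rng(0)
    clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0
    noisy = clean + 0.1 * rng.standard_normal(clean.shape)
    m, v = noisy, np.full(noisy.shape, 0.01)    # placeholder means/variances
    print(anisotropic_tv(noisy), em_update_lambda(m, v))

Because q has a diagonal covariance, the mean and variance of each difference come directly from the marginals of neighboring pixels, which is what keeps such updates cheap and parallelizable.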
Related papers
- Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
We obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
We propose to improve recent methods by using a more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Variational sparse inverse Cholesky approximation for latent Gaussian processes via double Kullback-Leibler minimization [6.012173616364571]
We combine a variational approximation of the posterior with a similar and efficient SIC-restricted Kullback-Leibler-optimal approximation of the prior.
For this setting, our variational approximation can be computed via gradient descent in polylogarithmic time per iteration.
We provide numerical comparisons showing that the proposed double-Kullback-Leibler-optimal Gaussian-process approximation (DKLGP) can sometimes be vastly more accurate for stationary kernels than alternative approaches.
arXiv Detail & Related papers (2023-01-30T21:50:08Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models (a generic sketch of vanilla AIS is given after this list).
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions so that fewer sampling steps are needed.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are made by using plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- A Probabilistic Deep Image Prior for Computational Tomography [0.19573380763700707]
Existing deep-learning based tomographic image reconstruction methods do not provide accurate estimates of reconstruction uncertainty.
We construct a Bayesian prior for tomographic reconstruction, which combines the classical total variation (TV) regulariser with the modern deep image prior (DIP).
For the inference, we develop an approach based on the linearised Laplace method, which is scalable to high-dimensional settings.
arXiv Detail & Related papers (2022-02-28T14:47:14Z)
- Posterior temperature optimized Bayesian models for inverse problems in medical imaging [59.82184400837329]
We present an unsupervised Bayesian approach to inverse problems in medical imaging using mean-field variational inference with a fully tempered posterior.
We show that an optimized posterior temperature leads to improved accuracy and uncertainty estimation.
Our source code is publicly available at github.com/Cardio-AI/mfvi-dip-mia.
arXiv Detail & Related papers (2022-02-02T12:16:33Z)
- Patch-Based Image Restoration using Expectation Propagation [7.7731951589289565]
Monte Carlo techniques can suffer from scalability issues in high-dimensional inference problems such as image restoration.
EP is used here to approximate the posterior distributions using products of multivariate Gaussian densities.
Experiments conducted for denoising, inpainting and deconvolution problems with Gaussian and Poisson noise illustrate the potential benefits of such a flexible approximate Bayesian method.
arXiv Detail & Related papers (2021-06-18T10:45:15Z)
- Sampling possible reconstructions of undersampled acquisitions in MR imaging [9.75702493778194]
Undersampling the k-space during MR acquisition saves time; however, it results in an ill-posed inversion problem, leading to an infinite set of images as possible solutions.
Traditionally, this is tackled as a reconstruction problem by searching for a single "best" image out of this solution set according to some chosen regularization or prior.
We propose a method that instead returns multiple images which are possible under the acquisition model and the chosen prior to capture the uncertainty in the inversion process.
arXiv Detail & Related papers (2020-09-30T18:20:06Z)
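For reference, below is a minimal, generic sketch of vanilla AIS, as mentioned in the Annealed Importance Sampling entry above: a geometric annealing path with simple Metropolis transitions, applied to a toy 1-D target whose true log-normalizer is known. This is not the optimized parametric variant of that paper; the toy target, step sizes and all names are our own illustrative choices.

    import numpy as np

    rng = np.random.default_rng(1)

    def log_prior(x):                 # standard normal (normalized)
        return -0.5 * x ** 2 - 0.5 * np.log(2.0 * np.pi)

    def log_target(x):                # unnormalized N(3, 0.5^2)
        return -0.5 * ((x - 3.0) / 0.5) ** 2

    def ais_log_z(n_chains=500, n_steps=100, step=0.5):
        """Estimate log Z of the target via AIS with the geometric path
        f_t = prior^(1 - beta_t) * target^(beta_t), beta_t from 0 to 1."""
        betas = np.linspace(0.0, 1.0, n_steps + 1)
        x = rng.standard_normal(n_chains)     # exact draws from the prior
        log_w = np.zeros(n_chains)
        for b_prev, b in zip(betas[:-1], betas[1:]):
            # Importance-weight increment for moving from beta_prev to beta.
            log_w += (b - b_prev) * (log_target(x) - log_prior(x))
            # One Metropolis step targeting the current bridging distribution.
            def log_f(z):
                return (1.0 - b) * log_prior(z) + b * log_target(z)
            prop = x + step * rng.standard_normal(n_chains)
            accept = np.log(rng.random(n_chains)) < log_f(prop) - log_f(x)
            x = np.where(accept, prop, x)
        # Log of the average importance weight, computed stably.
        return np.logaddexp.reduce(log_w) - np.log(n_chains)

    true_log_z = np.log(0.5 * np.sqrt(2.0 * np.pi))   # analytic normalizer
    print(ais_log_z(), true_log_z)                    # the two should be close

The paper above replaces this fixed geometric schedule with flexible, parametric bridging distributions that are optimized so that far fewer annealing steps are needed.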
This list is automatically generated from the titles and abstracts of the papers on this site.