Do Bayesian imaging methods report trustworthy probabilities?
- URL: http://arxiv.org/abs/2405.08179v1
- Date: Mon, 13 May 2024 20:57:01 GMT
- Title: Do Bayesian imaging methods report trustworthy probabilities?
- Authors: David Y. W. Thong, Charlesquin Kemajou Mbakam, Marcelo Pereyra
- Abstract summary: We run a large experiment requiring 1,000 GPU-hours to probe the accuracy of five canonical Bayesian imaging methods.
We find that, in a few cases, the probabilities reported by modern Bayesian imaging techniques are in broad agreement with long-term averages.
Existing Bayesian imaging methods are generally not able to deliver reliable uncertainty quantification results.
- Score: 0.18434042562191813
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian statistics is a cornerstone of imaging sciences, underpinning many and varied approaches from Markov random fields to score-based denoising diffusion models. In addition to powerful image estimation methods, the Bayesian paradigm also provides a framework for uncertainty quantification and for using image data as quantitative evidence. These probabilistic capabilities are important for the rigorous interpretation of experimental results and for robust interfacing of quantitative imaging pipelines with scientific and decision-making processes. However, are the probabilities delivered by existing Bayesian imaging methods meaningful under replication of an experiment, or are they only meaningful as subjective measures of belief? This paper presents a Monte Carlo method to explore this question. We then leverage the proposed Monte Carlo method and run a large experiment requiring 1,000 GPU-hours to probe the accuracy of five canonical Bayesian imaging methods that are representative of some of the main Bayesian imaging strategies from the past decades (a score-based denoising diffusion technique, a plug-and-play Langevin algorithm utilising a Lipschitz-regularised DnCNN denoiser, a Bayesian method with a dictionary-based prior trained subject to a log-concavity constraint, an empirical Bayesian method with a total-variation prior, and a hierarchical Bayesian Gibbs sampler based on a Gaussian Markov random field model). We find that, in a few cases, the probabilities reported by modern Bayesian imaging techniques are in broad agreement with long-term averages as observed over a large number of replications of an experiment, but existing Bayesian imaging methods are generally not able to deliver reliable uncertainty quantification results.
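The central check here is frequentist calibration: replicate the same imaging experiment many times, and compare the nominal probability of the reported credible regions with the long-run frequency at which they actually contain the ground truth. Below is a minimal sketch of such a coverage test; `simulate_measurement` and `draw_posterior_samples` are hypothetical stand-ins for the forward model and for any of the five posterior samplers, not the paper's code.

```python
import numpy as np

def empirical_coverage(x_true, simulate_measurement, draw_posterior_samples,
                       n_replications=1000, alpha=0.1, n_samples=500):
    """Estimate how often (1 - alpha) credible intervals cover the truth
    over independent replications of the same imaging experiment."""
    hits = np.zeros_like(x_true, dtype=float)
    for _ in range(n_replications):
        y = simulate_measurement(x_true)                 # fresh noisy data
        samples = draw_posterior_samples(y, n_samples)   # (n_samples, *x.shape)
        lo = np.quantile(samples, alpha / 2, axis=0)
        hi = np.quantile(samples, 1 - alpha / 2, axis=0)
        hits += (lo <= x_true) & (x_true <= hi)
    # Per-pixel empirical coverage; a well-calibrated method gives ~ 1 - alpha.
    return hits / n_replications
```

A method whose 90% credible intervals cover the truth far less (or far more) than 90% of the time is reporting probabilities that are not meaningful under replication.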
Related papers
- Empirical Bayesian image restoration by Langevin sampling with a denoising diffusion implicit prior [0.18434042562191813]
This paper presents a novel and highly computationally efficient image restoration method.
It embeds a DDPM denoiser within an empirical Bayesian Langevin algorithm.
It improves on state-of-the-art strategies both in image estimation accuracy and computing time.
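For intuition, the denoiser-within-Langevin construction typically follows the plug-and-play unadjusted Langevin pattern, in which Tweedie's formula turns a pretrained denoiser into an approximate prior score. The sketch below shows only this generic update, assuming a Gaussian likelihood; it omits the paper's empirical Bayesian calibration of regularisation parameters.

```python
import numpy as np

def pnp_ula(y, A, At, denoiser, sigma2, eps, delta, n_iter, x0):
    """Generic plug-and-play unadjusted Langevin sampler: the learned
    denoiser acts as a proxy for the prior score via Tweedie's formula."""
    x = x0.copy()
    samples = []
    for _ in range(n_iter):
        grad_lik = At(y - A(x)) / sigma2        # gradient of Gaussian log-likelihood
        prior_score = (denoiser(x) - x) / eps   # Tweedie approximation of grad log p(x)
        noise = np.sqrt(2 * delta) * np.random.randn(*x.shape)
        x = x + delta * (grad_lik + prior_score) + noise
        samples.append(x.copy())
    return samples
```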
arXiv Detail & Related papers (2024-09-06T16:20:24Z)
- Regularization by denoising: Bayesian model and Langevin-within-split Gibbs sampling [6.453497703172228]
This paper introduces a Bayesian framework for image inversion by deriving a probabilistic counterpart to the regularization-by-denoising (RED) paradigm.
It implements a Monte Carlo algorithm specifically tailored for sampling from the resulting posterior distribution, based on asymptotically exact data augmentation (AXDA).
The proposed algorithm is an approximate instance of split Gibbs sampling (SGS) which embeds one Langevin Monte Carlo step.
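As a rough illustration, a splitting variable z decouples the likelihood from the prior, and the sampler alternates between the two conditionals. This is a simplified sketch: in the actual SGS construction the conditional of x given z is Gaussian and can often be sampled exactly, whereas here both updates use a single Langevin step; `prior_score` is a hypothetical stand-in for a denoiser-based surrogate of the prior score.

```python
import numpy as np

def split_gibbs(y, A, At, sigma2, rho2, prior_score, delta, n_iter, x0):
    """Sketch of split Gibbs sampling: alternate updates of the image x
    (likelihood + quadratic coupling) and the splitting variable z
    (prior + quadratic coupling), each via one Langevin step."""
    x, z = x0.copy(), x0.copy()
    for _ in range(n_iter):
        grad_x = At(y - A(x)) / sigma2 + (z - x) / rho2
        x = x + delta * grad_x + np.sqrt(2 * delta) * np.random.randn(*x.shape)
        grad_z = prior_score(z) + (x - z) / rho2
        z = z + delta * grad_z + np.sqrt(2 * delta) * np.random.randn(*z.shape)
    return x, z
```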
arXiv Detail & Related papers (2024-02-19T17:12:16Z)
- Equivariant Bootstrapping for Uncertainty Quantification in Imaging Inverse Problems [0.24475591916185502]
We present a new uncertainty quantification methodology based on an equivariant formulation of the parametric bootstrap algorithm.
The proposed methodology is general and can be easily applied with any image reconstruction technique.
We demonstrate the proposed approach with a series of numerical experiments and through comparisons with alternative uncertainty quantification strategies.
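The idea can be sketched as follows: resample the data by pushing the current reconstruction through random group actions (e.g. flips or rotations) under which the inverse problem is approximately equivariant, re-reconstruct, map back, and read the uncertainty off the spread of the bootstrap replicates. All function names below are illustrative assumptions, not the authors' API.

```python
import numpy as np

def equivariant_bootstrap(y, forward, reconstruct, random_transform,
                          noise_std, n_boot=100):
    """Sketch of an equivariant parametric bootstrap for imaging UQ."""
    x_hat = reconstruct(y)
    replicates = []
    for _ in range(n_boot):
        T, T_inv = random_transform()        # a group action and its inverse
        y_b = forward(T(x_hat)) + noise_std * np.random.randn(*y.shape)
        replicates.append(T_inv(reconstruct(y_b)))  # back to the original frame
    replicates = np.stack(replicates)
    return replicates.std(axis=0)            # e.g. per-pixel standard error
```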
arXiv Detail & Related papers (2023-10-18T09:43:15Z)
- BayesCap: Bayesian Identity Cap for Calibrated Uncertainty in Frozen Neural Networks [50.15201777970128]
We propose BayesCap that learns a Bayesian identity mapping for the frozen model, allowing uncertainty estimation.
BayesCap is a memory-efficient method that can be trained on a small fraction of the original dataset.
We show the efficacy of our method on a wide variety of tasks with a diverse set of architectures.
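Conceptually, this amounts to training a small "cap" network that reproduces the frozen model's output (an identity mapping) while predicting a per-output uncertainty by maximum likelihood. The sketch below uses a Gaussian likelihood for brevity; the actual BayesCap uses a heteroscedastic generalized Gaussian.

```python
import torch
import torch.nn as nn

class UncertaintyCap(nn.Module):
    """BayesCap-style head: reconstructs the frozen model's output and
    predicts a per-pixel log-variance (simplified Gaussian version)."""
    def __init__(self, channels=3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU())
        self.mu_head = nn.Conv2d(32, channels, 3, padding=1)
        self.logvar_head = nn.Conv2d(32, channels, 3, padding=1)

    def forward(self, y_frozen):
        h = self.body(y_frozen)
        return self.mu_head(h), self.logvar_head(h)

def nll_loss(mu, logvar, target):
    # Gaussian negative log-likelihood; target is the frozen model's output.
    return (0.5 * torch.exp(-logvar) * (mu - target) ** 2 + 0.5 * logvar).mean()
```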
arXiv Detail & Related papers (2022-07-14T12:50:09Z)
- Mining the manifolds of deep generative models for multiple data-consistent solutions of ill-posed tomographic imaging problems [10.115302976900445]
Tomographic imaging is in general an ill-posed inverse problem.
We propose a new empirical sampling method that computes multiple solutions of a tomographic inverse problem.
arXiv Detail & Related papers (2022-02-10T20:27:31Z)
- Deblurring via Stochastic Refinement [85.42730934561101]
We present an alternative framework for blind deblurring based on conditional diffusion models.
Our method is competitive in terms of distortion metrics such as PSNR.
arXiv Detail & Related papers (2021-12-05T04:36:09Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present PDC-Net+, an Enhanced Probabilistic Dense Correspondence Network capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- Quantifying Uncertainty in Deep Spatiotemporal Forecasting [67.77102283276409]
We describe two types of forecasting problems: regular grid-based and graph-based.
We analyze UQ methods from both the Bayesian and the frequentist points of view, casting them in a unified framework via statistical decision theory.
Through extensive experiments on real-world road network traffic, epidemics, and air quality forecasting tasks, we reveal the statistical-computational trade-offs for different UQ methods.
arXiv Detail & Related papers (2021-05-25T14:35:46Z)
- Learning Accurate Dense Correspondences and When to Trust Them [161.76275845530964]
We aim to estimate a dense flow field relating two images, coupled with a robust pixel-wise confidence map.
We develop a flexible probabilistic approach that jointly learns the flow prediction and its uncertainty.
Our approach obtains state-of-the-art results on challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-01-05T18:54:11Z)
- Tracking disease outbreaks from sparse data with Bayesian inference [55.82986443159948]
The COVID-19 pandemic provides new motivation for estimating the empirical rate of transmission during an outbreak.
Standard methods struggle to accommodate the partial observability and sparse data common at finer scales.
We propose a Bayesian framework which accommodates partial observability in a principled manner.
arXiv Detail & Related papers (2020-09-12T20:37:33Z)
- A deep-learning based Bayesian approach to seismic imaging and uncertainty quantification [0.4588028371034407]
Uncertainty is essential when dealing with ill-conditioned inverse problems.
It is often not possible to formulate a prior distribution that precisely encodes our prior knowledge about the unknown.
We propose to use the functional form of a randomly initialized convolutional neural network as an implicit structured prior.
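In deep-prior approaches of this kind, the unknown image is reparameterised as the output of an untrained CNN with a fixed random input, so that a simple Gaussian prior on the weights induces a structured implicit prior on images; sampling the weights then yields posterior image samples. A hedged sketch, with `forward_op` standing in for the (here, seismic) forward model:

```python
import torch
import torch.nn as nn

class ImplicitPriorNet(nn.Module):
    """Untrained CNN g(z; theta) whose functional form acts as the prior."""
    def __init__(self, channels=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1))

    def forward(self, z):
        return self.net(z)

def log_posterior(model, z, y, forward_op, sigma2, tau2):
    """Unnormalised log-posterior over the network weights: Gaussian data
    likelihood plus a Gaussian (weight-decay) prior on theta. Running MCMC
    or SGLD over the weights yields image samples g(z; theta)."""
    x = model(z)
    data_misfit = ((forward_op(x) - y) ** 2).sum() / (2 * sigma2)
    weight_prior = sum((p ** 2).sum() for p in model.parameters()) / (2 * tau2)
    return -(data_misfit + weight_prior)
```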
arXiv Detail & Related papers (2020-01-13T23:46:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.