With or Without Replacement? Improving Confidence in Fourier Imaging
- URL: http://arxiv.org/abs/2407.13575v1
- Date: Thu, 18 Jul 2024 15:15:19 GMT
- Title: With or Without Replacement? Improving Confidence in Fourier Imaging
- Authors: Frederik Hoppe, Claudio Mayrink Verdun, Felix Krahmer, Marion I. Menzel, Holger Rauhut
- Abstract summary: We show how a transition between sampling with and without replacement can lead to a weighted reconstruction scheme with improved performance for the standard LASSO.
In this paper, we illustrate how this reweighted sampling idea can also improve the debiased estimator.
- Score: 5.542462410129539
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Over the last few years, debiased estimators have been proposed in order to establish rigorous confidence intervals for high-dimensional problems in machine learning and data science. The core argument is that the error of these estimators with respect to the ground truth can be expressed as a Gaussian variable plus a remainder term that vanishes as long as the dimension of the problem is sufficiently high. Thus, uncertainty quantification (UQ) can be performed exploiting the Gaussian model. Empirically, however, the remainder term cannot be neglected in many realistic situations of moderately sized dimensions, in particular in certain structured measurement scenarios such as Magnetic Resonance Imaging (MRI). This, in turn, can downgrade the advantage of the UQ methods as compared to non-UQ approaches such as the standard LASSO. In this paper, we present a method to improve the debiased estimator by sampling without replacement. Our approach leverages our recent results on the random structure of certain sampling schemes, which show how a transition between sampling with and without replacement can lead to a weighted reconstruction scheme with improved performance for the standard LASSO. We illustrate how this reweighted sampling idea can also improve the debiased estimator and, consequently, provide a better method for UQ in Fourier imaging.
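For orientation, here is a minimal numerical sketch (not the authors' code, and not their reweighted scheme) of the standard debiased-LASSO pipeline the abstract refers to, for a subsampled Fourier model with an assumed ISTA solver and illustrative parameters; the final coverage check is exactly where the remainder term discussed above can hurt at moderate dimensions.

    import numpy as np

    # Illustrative setup: y = A x0 + noise, with A built from m randomly chosen rows
    # of the unitary DFT matrix (sampling *with* replacement; the paper's weighted
    # scheme concerns the transition to replace=False, which is not implemented here).
    rng = np.random.default_rng(0)
    N, m, sigma, alpha = 256, 128, 0.05, 0.05
    F = np.fft.fft(np.eye(N)) / np.sqrt(N)
    rows = rng.choice(N, size=m, replace=True)
    A = np.sqrt(N / m) * F[rows]                      # scaled so that E[A^H A] = I

    x0 = np.zeros(N, dtype=complex)
    x0[rng.choice(N, 10, replace=False)] = 2.0 * rng.standard_normal(10)
    noise = sigma * (rng.standard_normal(m) + 1j * rng.standard_normal(m)) / np.sqrt(2)
    y = A @ x0 + noise

    def ista(A, y, lam, iters=500):
        """Plain ISTA for the complex LASSO min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
        x = np.zeros(A.shape[1], dtype=complex)
        L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
        for _ in range(iters):
            z = x - A.conj().T @ (A @ x - y) / L
            x = np.maximum(np.abs(z) - lam / L, 0.0) * np.exp(1j * np.angle(z))
        return x

    x_lasso = ista(A, y, lam=2.0 * sigma * np.sqrt(np.log(N)))   # heuristic lambda
    x_debiased = x_lasso + A.conj().T @ (y - A @ x_lasso)        # one-step debiasing

    # If the remainder term were negligible, each coordinate error would be roughly
    # CN(0, sigma^2), giving the confidence radius below; at moderate N the remainder
    # is not negligible, which is the effect the paper sets out to reduce.
    radius = sigma * np.sqrt(np.log(1.0 / alpha))
    print(f"empirical coverage: {np.mean(np.abs(x_debiased - x0) <= radius):.3f}",
          f"(nominal {1 - alpha:.2f})")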
Related papers
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such prediction intervals via the empirical estimation of quantiles in the distribution of outputs (a minimal sketch of this construction follows this entry).
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile-regression-based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with improved desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
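For context, a minimal sketch (synthetic data, scikit-learn's built-in quantile loss; not the RQR method itself) of the standard quantile-regression interval construction that RQR relaxes:

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + rng.standard_normal(500) * (0.2 + 0.2 * (X[:, 0] > 0))  # heteroscedastic noise

    # Each model minimizes the pinball loss max(tau*(y - q), (tau - 1)*(y - q)) at a
    # fixed quantile level tau; RQR's point is to drop this fixed-level constraint.
    alpha = 0.1
    lower = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2).fit(X, y)
    upper = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2).fit(X, y)

    x_test = np.array([[0.5]])
    print(lower.predict(x_test), upper.predict(x_test))   # approx. 90% prediction interval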
- Modeling uncertainty for Gaussian Splatting [21.836830270709]
We present the first framework for uncertainty estimation using Gaussian Splatting (GS).
We introduce a Variational Inference-based approach that seamlessly integrates uncertainty prediction into the common rendering pipeline of GS.
We also introduce the Area Under Sparsification Error (AUSE) as a new term in the loss function, enabling optimization of uncertainty estimation alongside image reconstruction (an illustrative AUSE computation is sketched after this entry).
arXiv Detail & Related papers (2024-03-27T11:45:08Z)
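A rough, unnormalized illustration of the sparsification-error idea behind AUSE (made-up arrays; the exact curve and normalization used in the paper may differ):

    import numpy as np

    def ause(per_pixel_error, predicted_uncertainty, steps=20):
        """Remove pixels in order of decreasing predicted uncertainty and average the
        gap to the oracle curve, which removes pixels by decreasing true error."""
        by_uncertainty = np.argsort(-predicted_uncertainty)
        by_error = np.argsort(-per_pixel_error)
        n = per_pixel_error.size
        gaps = []
        for k in range(steps):
            keep = n - (n * k) // steps               # pixels kept at this removal fraction
            e_pred = per_pixel_error[by_uncertainty[n - keep:]].mean()
            e_orac = per_pixel_error[by_error[n - keep:]].mean()
            gaps.append(e_pred - e_orac)
        return float(np.mean(gaps))

    rng = np.random.default_rng(0)
    err = rng.random(10_000)
    unc = err + 0.1 * rng.standard_normal(10_000)     # noisy but informative uncertainty
    print(ause(err, unc))                             # closer to 0 = better-ranked uncertainty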
- Deep Variational Inverse Scattering [18.598311270757527]
Inverse medium scattering solvers generally reconstruct a single solution without an associated measure of uncertainty.
Deep networks such as conditional normalizing flows can be used to sample posteriors in inverse problems.
We propose U-Flow, a Bayesian U-Net based on conditional normalizing flows, which generates high-quality posterior samples and estimates physically-meaningful uncertainty.
arXiv Detail & Related papers (2022-12-08T14:57:06Z)
- Deblurring via Stochastic Refinement [85.42730934561101]
We present an alternative framework for blind deblurring based on conditional diffusion models.
Our method is competitive in terms of distortion metrics such as PSNR.
arXiv Detail & Related papers (2021-12-05T04:36:09Z)
- Learning to Estimate Without Bias [57.82628598276623]
The Gauss-Markov theorem states that the weighted least squares estimator is the linear minimum variance unbiased estimator (MVUE) in linear models (see the numerical illustration after this entry).
In this paper, we take a first step towards extending this result to nonlinear settings via deep learning with bias constraints.
A second motivation for the bias-constrained estimator (BCE) is in applications where multiple estimates of the same unknown are averaged for improved performance.
arXiv Detail & Related papers (2021-10-24T10:23:51Z)
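As a quick numerical reminder of the classical statement referenced above (made-up data; this is not the paper's bias-constrained network), weighted least squares with inverse-noise-variance weights attains the Gauss-Markov optimum among linear unbiased estimators:

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 200, 3
    X = rng.standard_normal((n, p))
    beta = np.array([1.0, -2.0, 0.5])
    var = rng.uniform(0.1, 2.0, size=n)               # heteroscedastic noise variances
    y = X @ beta + rng.standard_normal(n) * np.sqrt(var)

    W = np.diag(1.0 / var)                            # weights = inverse noise covariance
    beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # best linear unbiased estimator
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)           # unbiased, but larger variance
    print(beta_wls, beta_ols)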
- Fast Scalable Image Restoration using Total Variation Priors and Expectation Propagation [7.7731951589289565]
This paper presents a scalable approximate Bayesian method for image restoration using total variation (TV) priors.
We use the expectation propagation (EP) framework to approximate minimum mean squared error (MMSE) estimators and marginal (pixel-wise) variances.
arXiv Detail & Related papers (2021-10-04T17:28:41Z)
- Patch-Based Image Restoration using Expectation Propagation [7.7731951589289565]
Monte Carlo techniques can suffer from scalability issues in high-dimensional inference problems such as image restoration.
EP is used here to approximate the posterior distributions using products of multivariate Gaussian densities.
Experiments conducted for denoising, inpainting and deconvolution problems with Gaussian and Poisson noise illustrate the potential benefits of such a flexible approximate Bayesian method.
arXiv Detail & Related papers (2021-06-18T10:45:15Z)
- Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z)
- Bayesian Uncertainty Estimation of Learned Variational MRI Reconstruction [63.202627467245584]
We introduce a Bayesian variational framework to quantify the model-immanent (epistemic) uncertainty.
We demonstrate that our approach yields competitive results for undersampled MRI reconstruction.
arXiv Detail & Related papers (2021-02-12T18:08:14Z)
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle (a toy CNML computation is sketched after this entry).
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
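The CNML distribution that ACNML amortizes can be computed directly for toy problems by refitting per candidate label; the sketch below uses a regularized logistic model as a stand-in for the exact maximum-likelihood refit, so it is only illustrative:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def cnml_probs(X_train, y_train, x_query):
        """Toy binary CNML: refit once per candidate label on the training data
        augmented with the query point, then normalize the probability each refit
        assigns to its own candidate label."""
        scores = []
        for label in (0, 1):
            X_aug = np.vstack([X_train, x_query[None, :]])
            y_aug = np.append(y_train, label)
            model = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
            scores.append(model.predict_proba(x_query[None, :])[0, label])
        scores = np.asarray(scores)
        return scores / scores.sum()

    rng = np.random.default_rng(0)
    X_train = rng.standard_normal((100, 2))
    y_train = (X_train[:, 0] + 0.5 * rng.standard_normal(100) > 0).astype(int)
    # A query in a direction the training data barely constrains: either label can be
    # fit after augmentation, so the CNML probabilities tend toward uniform.
    print(cnml_probs(X_train, y_train, np.array([0.0, 5.0])))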