Deep Variational Inverse Scattering
- URL: http://arxiv.org/abs/2212.04309v2
- Date: Fri, 9 Dec 2022 11:25:35 GMT
- Title: Deep Variational Inverse Scattering
- Authors: AmirEhsan Khorashadizadeh, Ali Aghababaei, Tin Vlašić, Hieu Nguyen, Ivan Dokmanić
- Abstract summary: Inverse medium scattering solvers generally reconstruct a single solution without an associated measure of uncertainty.
Deep networks such as conditional normalizing flows can be used to sample posteriors in inverse problems.
We propose U-Flow, a Bayesian U-Net based on conditional normalizing flows, which generates high-quality posterior samples and estimates physically meaningful uncertainty.
- Score: 18.598311270757527
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Inverse medium scattering solvers generally reconstruct a single solution
without an associated measure of uncertainty. This is true both for the
classical iterative solvers and for the emerging deep learning methods. But
ill-posedness and noise can make this single estimate inaccurate or misleading.
While deep networks such as conditional normalizing flows can be used to sample
posteriors in inverse problems, they often yield low-quality samples and
uncertainty estimates. In this paper, we propose U-Flow, a Bayesian U-Net based
on conditional normalizing flows, which generates high-quality posterior
samples and estimates physically meaningful uncertainty. We show that the
proposed model significantly outperforms recent conditional normalizing flows
in posterior sample quality while remaining comparable to the U-Net in point
estimation.
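The U-Flow architecture is not spelled out in this abstract, but the underlying mechanism, a conditional normalizing flow that maps Gaussian noise to posterior samples given the measurements, can be sketched. The code below is a generic conditional RealNVP-style flow, not the authors' model; the class names, layer sizes, masking scheme, and training loop are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """Affine coupling layer whose scale and shift depend on half of x and on y."""
    def __init__(self, dim: int, cond_dim: int, hidden: int = 64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, y):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(torch.cat([x1, y], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)  # bound the log-scales for numerical stability
        z2 = x2 * torch.exp(s) + t
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)  # output, log|det J|

    def inverse(self, z, y):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(torch.cat([z1, y], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        return torch.cat([z1, (z2 - t) * torch.exp(-s)], dim=1)

class ConditionalFlow(nn.Module):
    def __init__(self, dim: int, cond_dim: int, n_layers: int = 4):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(
            ConditionalAffineCoupling(dim, cond_dim) for _ in range(n_layers)
        )

    def log_prob(self, x, y):
        """Exact log p(x | y) via the change-of-variables formula."""
        log_det = 0.0
        for i, layer in enumerate(self.layers):
            if i % 2:                      # alternate which half is transformed
                x = x.flip(1)
            x, ld = layer(x, y)
            log_det = log_det + ld
        base = -0.5 * (x ** 2).sum(dim=1) - 0.5 * self.dim * math.log(2 * math.pi)
        return base + log_det

    @torch.no_grad()
    def sample_posterior(self, y, n_samples: int = 128):
        """Draw posterior samples for one measurement y of shape (1, cond_dim)."""
        z = torch.randn(n_samples, self.dim)
        y = y.expand(n_samples, -1)
        for i in reversed(range(len(self.layers))):
            z = self.layers[i].inverse(z, y)
            if i % 2:
                z = z.flip(1)
        return z

# Training sketch: maximize log p(x | y) over paired examples, e.g.
#   flow = ConditionalFlow(dim=64, cond_dim=32)
#   loss = -flow.log_prob(x_batch, y_batch).mean()
```

Given a trained flow, the mean over `sample_posterior(y)` serves as a point estimate and the per-coordinate standard deviation as an uncertainty map, which is the kind of output the abstract describes.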
Related papers
- Enhancing Diffusion Posterior Sampling for Inverse Problems by Integrating Crafted Measurements [45.70011319850862]
Diffusion models have emerged as powerful foundation models for visual generation.
Current posterior-sampling methods incorporate the measurement into the posterior sampling process to infer the distribution of the target data.
We show that high-frequency information can be introduced prematurely in the early sampling stages, which can induce larger posterior estimation errors.
We propose DPS-CM, a novel diffusion posterior sampling method that incorporates a Crafted Measurement.
arXiv Detail & Related papers (2024-11-15T00:06:57Z)
- Error Feedback under $(L_0,L_1)$-Smoothness: Normalization and Momentum [56.37522020675243]
We provide the first proof of convergence for normalized error feedback algorithms across a wide range of machine learning problems.
We show that due to their larger allowable stepsizes, our new normalized error feedback algorithms outperform their non-normalized counterparts on various tasks.
arXiv Detail & Related papers (2024-10-22T10:19:27Z)
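For context, $(L_0,L_1)$-smoothness is the standard relaxation of the Lipschitz-gradient assumption in which the local smoothness may grow with the gradient norm; this definition comes from the wider literature, not from the paper's own statement:

$$\|\nabla^2 f(x)\| \le L_0 + L_1 \|\nabla f(x)\|.$$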
- With or Without Replacement? Improving Confidence in Fourier Imaging [5.542462410129539]
We show how a transition between sampling with and without replacement can lead to a weighted reconstruction scheme with improved performance for the standard LASSO.
In this paper, we illustrate how this reweighted sampling idea can also improve the debiased estimator.
arXiv Detail & Related papers (2024-07-18T15:15:19Z)
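As a hedged illustration of the estimator class involved (the paper's specific weights are not reproduced here), a weighted LASSO reconstruction takes the form

$$\hat{x} = \operatorname*{arg\,min}_{x} \; \|W(Ax - y)\|_2^2 + \lambda \|x\|_1,$$

where $A$ is the subsampled Fourier operator, $y$ the measurements, and $W$ a diagonal weight matrix induced by the sampling scheme.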
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
We propose to improve recent methods by using a more principled covariance, determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- GFlowOut: Dropout with Generative Flow Networks [76.59535235717631]
Monte Carlo dropout has been widely used as a relatively cheap way to perform approximate inference.
Recent works show that the dropout mask can be viewed as a latent variable, which can be inferred with variational inference.
GFlowOut leverages the recently proposed probabilistic framework of Generative Flow Networks (GFlowNets) to learn the posterior distribution over dropout masks.
arXiv Detail & Related papers (2022-10-24T03:00:01Z)
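The baseline premise here, dropout masks kept stochastic at test time so that repeated forward passes act as approximate posterior samples, can be sketched as follows. This is generic Monte Carlo dropout, not the GFlowNet-based estimator, and the toy model is an assumption.

```python
import torch
import torch.nn as nn

# Toy regressor with a dropout layer; any dropout-bearing network works.
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 100):
    """Keep dropout active and aggregate stochastic forward passes."""
    model.train()  # train mode keeps the dropout masks stochastic
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)  # predictive mean, uncertainty

mean, std = mc_dropout_predict(model, torch.randn(8, 16))
```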
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
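For reference, the CNML distribution that ACNML amortizes has the standard form (stated here from the general minimum-description-length literature rather than from this paper):

$$p_{\mathrm{CNML}}(y \mid x) = \frac{p_{\hat\theta(x,y)}(y \mid x)}{\sum_{y'} p_{\hat\theta(x,y')}(y' \mid x)},$$

where $\hat\theta(x,y)$ is the maximum-likelihood estimate after adding the candidate pair $(x,y)$ to the training data; amortization avoids recomputing $\hat\theta$ for every query.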
- Uncertainty Quantification in Deep Residual Neural Networks [0.0]
Uncertainty quantification is an important and challenging problem in deep learning.
Previous methods rely either on dropout layers, which are not present in modern deep architectures, or on batch normalization, which is sensitive to batch sizes.
We show that training residual networks with stochastic depth can be interpreted as a variational approximation to the posterior over the network weights.
arXiv Detail & Related papers (2020-07-09T16:05:37Z)
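A minimal sketch of the mechanism, a residual branch gated by a Bernoulli variable; this is the standard stochastic-depth construction, not necessarily this paper's exact model.

```python
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    """Residual block whose transform branch is randomly dropped during training.

    Keeping the gate stochastic at test time yields an implicit ensemble from
    which predictive uncertainty can be estimated.
    """
    def __init__(self, dim: int, survival_prob: float = 0.8):
        super().__init__()
        self.survival_prob = survival_prob
        self.branch = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Per-batch Bernoulli gate, a simplification of per-sample gating.
            if torch.rand(1).item() < self.survival_prob:
                return x + self.branch(x)
            return x
        # Deterministic test-time behavior: scale by the survival probability.
        return x + self.survival_prob * self.branch(x)
```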
- Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization [0.9176056742068814]
Uncertainty quantification for full-waveform inversion provides a probabilistic characterization of the ill-conditioning of the problem.
We propose an approach characterized by training a deep network that "pushes forward" Gaussian random inputs into the model space as if they were sampled from the actual posterior distribution.
arXiv Detail & Related papers (2020-04-16T18:37:56Z)
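In pushforward notation (symbols assumed here for illustration, not taken from the paper), the network $g_\theta$ is trained so that

$$x = g_\theta(z), \quad z \sim \mathcal{N}(0, I),$$

with the distribution of $g_\theta(z)$ approximating the posterior over models.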
- Solving Inverse Problems with a Flow-based Noise Model [100.18560761392692]
We study image inverse problems with a normalizing flow prior.
Our formulation views the solution as the maximum a posteriori estimate of the image conditioned on the measurements.
We empirically validate the efficacy of our method on various inverse problems, including compressed sensing with quantized measurements and denoising with highly structured noise patterns.
arXiv Detail & Related papers (2020-03-18T08:33:49Z)
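A hedged sketch of the resulting objective (generic MAP form; the paper's exact noise parameterization is not reproduced):

$$\hat{x} = \operatorname*{arg\,max}_{x} \; \log p(y \mid x) + \log p_{\mathrm{flow}}(x),$$

where $p_{\mathrm{flow}}$ is the exact density provided by the normalizing flow through the change-of-variables formula.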
- Composing Normalizing Flows for Inverse Problems [89.06155049265641]
We propose a framework for approximate inference that estimates the target conditional as a composition of two flow models.
Our method is evaluated on a variety of inverse problems and is shown to produce high-quality samples with uncertainty.
arXiv Detail & Related papers (2020-02-26T19:01:11Z)
- A deep-learning based Bayesian approach to seismic imaging and uncertainty quantification [0.4588028371034407]
Uncertainty is essential when dealing with ill-conditioned inverse problems.
It is often not possible to formulate a prior distribution that precisely encodes our prior knowledge about the unknown.
We propose to use the functional form of a randomly initialized convolutional neural network as an implicit structured prior.
arXiv Detail & Related papers (2020-01-13T23:46:18Z)
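The mechanism resembles the deep-prior idea: the unknown model is reparameterized by a randomly initialized CNN whose architecture acts as the structured regularizer. The sketch below fits the network weights to the data under a hypothetical forward operator `forward_op`; the paper's Bayesian sampling scheme is not reproduced, and all shapes are assumptions.

```python
import torch
import torch.nn as nn

def forward_op(x: torch.Tensor) -> torch.Tensor:
    """Hypothetical linear measurement operator; stands in for real physics."""
    return x.mean(dim=(2, 3))

# Randomly initialized CNN: its functional form is the implicit prior.
g = nn.Sequential(
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
z = torch.randn(1, 8, 32, 32)   # fixed random input
y_obs = torch.randn(1, 1)       # observed data (placeholder)

opt = torch.optim.Adam(g.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = ((forward_op(g(z)) - y_obs) ** 2).sum()  # data-misfit objective
    loss.backward()
    opt.step()
x_hat = g(z).detach()           # reconstructed model
```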
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.