Proximal Residual Flows for Bayesian Inverse Problems
- URL: http://arxiv.org/abs/2211.17158v1
- Date: Wed, 30 Nov 2022 16:49:49 GMT
- Title: Proximal Residual Flows for Bayesian Inverse Problems
- Authors: Johannes Hertrich
- Abstract summary: We introduce proximal residual flows, a new architecture of normalizing flows.
We ensure invertibility of certain residual blocks and extend the architecture to conditional residual flows.
We demonstrate the performance of proximal residual flows on numerical examples.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a powerful tool for generative modelling, density
estimation and posterior reconstruction in Bayesian inverse problems. In this
paper, we introduce proximal residual flows, a new architecture of normalizing
flows. Based on the fact that proximal neural networks are by definition
averaged operators, we ensure the invertibility of certain residual blocks.
Moreover, we extend the architecture to conditional proximal residual flows for
posterior reconstruction within Bayesian inverse problems. We demonstrate the
performance of proximal residual flows on numerical examples.
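The key observation above is that a residual block x ↦ x + g(x) is invertible whenever g is contractive, and the inverse can be computed by fixed-point iteration. The sketch below uses a made-up contractive map g, not the paper's proximal-network construction, purely to illustrate the invertibility mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative contractive residual map g (NOT the paper's proximal network):
# W is scaled so that Lip(g) <= 0.9 * ||W||_2 = 0.45 < 1.
W = rng.standard_normal((3, 3))
W *= 0.5 / np.linalg.norm(W, 2)

def g(x):
    return 0.9 * np.tanh(W @ x)

def forward(x):
    return x + g(x)  # residual block

def inverse(y, iters=100):
    # Banach fixed-point iteration x <- y - g(x); converges since Lip(g) < 1.
    x = y.copy()
    for _ in range(iters):
        x = y - g(x)
    return x

x = rng.standard_normal(3)
x_rec = inverse(forward(x))
print(np.allclose(x, x_rec))  # True: the block is numerically invertible
```

The paper's contribution is a principled way to guarantee this contractivity (averagedness) by construction, rather than by spectral normalization as in the numerical hack above.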
Related papers
- Conditional diffusions for neural posterior estimation [8.37884129644711]
We show the effectiveness of conditional diffusions as an alternative to normalizing flows for NPE.
Our results show improved stability, superior accuracy, and faster training times, even with simpler, shallower models.
arXiv Detail & Related papers (2024-10-24T19:13:13Z)
- Benign Overfitting for Regression with Trained Two-Layer ReLU Networks [14.36840959836957]
We study the least-square regression problem with a two-layer fully-connected neural network, with ReLU activation function, trained by gradient flow.
Our first result is a generalization result that requires no assumptions on the underlying regression function or the noise other than that they are bounded.
arXiv Detail & Related papers (2024-10-08T16:54:23Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- AccFlow: Backward Accumulation for Long-Range Optical Flow [70.4251045372285]
This paper proposes a novel recurrent framework called AccFlow for long-range optical flow estimation.
We demonstrate the superiority of backward accumulation over conventional forward accumulation.
Experiments validate the effectiveness of AccFlow in handling long-range optical flow estimation.
arXiv Detail & Related papers (2023-08-25T01:51:26Z)
- Deep Variational Inverse Scattering [18.598311270757527]
Inverse medium scattering solvers generally reconstruct a single solution without an associated measure of uncertainty.
Deep networks such as conditional normalizing flows can be used to sample posteriors in inverse problems.
We propose U-Flow, a Bayesian U-Net based on conditional normalizing flows, which generates high-quality posterior samples and estimates physically-meaningful uncertainty.
arXiv Detail & Related papers (2022-12-08T14:57:06Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Bayesian Recurrent Units and the Forward-Backward Algorithm [91.39701446828144]
Using Bayes's theorem, we derive a unit-wise recurrence as well as a backward recursion similar to the forward-backward algorithm.
The resulting Bayesian recurrent units can be integrated as recurrent neural networks within deep learning frameworks.
Experiments on speech recognition indicate that adding the derived units at the end of state-of-the-art recurrent architectures can improve the performance at a very low cost in terms of trainable parameters.
arXiv Detail & Related papers (2022-07-21T14:00:52Z)
- Deep Equilibrium Optical Flow Estimation [80.80992684796566]
Recent state-of-the-art (SOTA) optical flow models use finite-step recurrent update operations to emulate traditional algorithms.
These RNNs impose large computation and memory overheads, and are not directly trained to model such stable estimation.
We propose deep equilibrium (DEQ) flow estimators, an approach that directly solves for the flow as the infinite-level fixed point of an implicit layer.
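The fixed-point formulation can be sketched on a toy implicit layer. The map below is a made-up contraction (real DEQ models use trained networks and faster root-finders such as Anderson acceleration or Broyden's method rather than plain iteration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy implicit layer: z* solves z = tanh(W z + U x). With ||W||_2 < 1 the
# map is a contraction, so plain fixed-point iteration converges to z*.
W = rng.standard_normal((4, 4))
W *= 0.8 / np.linalg.norm(W, 2)
U = rng.standard_normal((4, 2))

def layer(z, x):
    return np.tanh(W @ z + U @ x)

def solve_fixed_point(x, tol=1e-10, max_iter=1000):
    z = np.zeros(4)
    for _ in range(max_iter):
        z_next = layer(z, x)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

x = rng.standard_normal(2)
z_star = solve_fixed_point(x)
print(np.allclose(z_star, layer(z_star, x)))  # True: z* satisfies z = f(z, x)
```

The output of the layer is thus defined implicitly by the equilibrium condition rather than by a fixed number of recurrent updates, which is what removes the per-step memory overhead during training.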
arXiv Detail & Related papers (2022-04-18T17:53:44Z)
- Universal Approximation of Residual Flows in Maximum Mean Discrepancy [24.493721984271566]
We study residual flows, a class of normalizing flows composed of Lipschitz residual blocks.
We prove residual flows are universal approximators in maximum mean discrepancy.
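Maximum mean discrepancy, the metric in which this approximation result is stated, is straightforward to estimate from samples. A minimal sketch with a Gaussian RBF kernel (the kernel choice and bandwidth here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def mmd2(X, Y, gamma=1.0):
    """Biased (V-statistic) estimate of squared MMD with a Gaussian RBF kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

X = rng.standard_normal((500, 2))          # target samples
Y = rng.standard_normal((500, 2))          # same distribution
Z = rng.standard_normal((500, 2)) + 3.0    # shifted distribution

print(mmd2(X, Y) < mmd2(X, Z))  # True: matching distributions give smaller MMD
```

A universal-approximation result in MMD then says the discrepancy between flow samples and target samples can be driven arbitrarily close to zero.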
arXiv Detail & Related papers (2021-03-10T00:16:33Z)
- Quasi-Autoregressive Residual (QuAR) Flows [0.0]
We introduce a simplification to residual flows using a Quasi-Autoregressive (QuAR) approach.
Compared to the standard residual flow approach, this simplification retains many of the benefits of residual flows while dramatically reducing the compute time and memory requirements.
arXiv Detail & Related papers (2020-09-16T01:56:24Z)
- Solving Inverse Problems with a Flow-based Noise Model [100.18560761392692]
We study image inverse problems with a normalizing flow prior.
Our formulation views the solution as the maximum a posteriori estimate of the image conditioned on the measurements.
We empirically validate the efficacy of our method on various inverse problems, including compressed sensing with quantized measurements and denoising with highly structured noise patterns.
arXiv Detail & Related papers (2020-03-18T08:33:49Z)
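The MAP formulation used in the last entry can be sketched in latent space: parameterize the image as x = T(z) through an invertible map with a standard normal latent z, and minimize the negative log-posterior over z. Here T is a hypothetical affine map standing in for a trained normalizing flow, and A, S, b, sigma are made-up toy quantities:

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear inverse problem y = A x + noise, with prior x = T(z) = S z + b,
# z ~ N(0, I). The MAP objective over z is
#   ||A T(z) - y||^2 / (2 sigma^2) + ||z||^2 / 2,
# which for an affine T is quadratic and solvable in closed form.
A = rng.standard_normal((3, 4))
S = np.diag([2.0, 1.0, 0.5, 0.25])   # illustrative invertible "flow"
b = np.ones(4)
sigma = 0.1

z_true = rng.standard_normal(4)
y = A @ (S @ z_true + b) + sigma * rng.standard_normal(3)

H = S.T @ A.T @ A @ S / sigma**2 + np.eye(4)   # Hessian of the objective
rhs = S.T @ A.T @ (y - A @ b) / sigma**2
z_map = np.linalg.solve(H, rhs)                # stationarity: gradient = 0
x_map = S @ z_map + b

print(np.allclose(H @ z_map, rhs))  # True: z_map solves the MAP equations
```

With a real (nonlinear) flow the objective is no longer quadratic, so the MAP estimate is found by gradient-based optimization over z instead of a linear solve; the structure of the objective is the same.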
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.