The efficacy and generalizability of conditional GANs for posterior
inference in physics-based inverse problems
- URL: http://arxiv.org/abs/2202.07773v1
- Date: Tue, 15 Feb 2022 22:57:05 GMT
- Title: The efficacy and generalizability of conditional GANs for posterior
inference in physics-based inverse problems
- Authors: Deep Ray, Harisankar Ramaswamy, Dhruv V. Patel, Assad A. Oberai
- Abstract summary: We train conditional Wasserstein generative adversarial networks to effectively sample from the posterior of physics-based Bayesian inference problems.
We show that the generator can learn inverse maps that are local in nature, which in turn promotes generalizability when testing on out-of-distribution samples.
- Score: 0.4588028371034407
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we train conditional Wasserstein generative adversarial
networks to effectively sample from the posterior of physics-based Bayesian
inference problems. The generator is constructed using a U-Net architecture,
with the latent information injected using conditional instance normalization.
The former facilitates a multiscale inverse map, while the latter enables the
decoupling of the latent space dimension from the dimension of the measurement,
and introduces stochasticity at all scales of the U-Net. We solve PDE-based
inverse problems to demonstrate the performance of our approach in quantifying
the uncertainty in the inferred field. Further, we show that the generator can learn
inverse maps that are local in nature, which in turn promotes generalizability
when testing on out-of-distribution samples.
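The conditional instance normalization described in the abstract can be sketched in a few lines: each channel is normalized over its spatial dimensions, then rescaled and shifted by an affine function of the latent code. The specific shapes and the linear maps `W_gamma`, `W_beta` below are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def conditional_instance_norm(x, z, W_gamma, b_gamma, W_beta, b_beta, eps=1e-5):
    """Instance-normalize each channel of x, then apply a per-channel scale
    and shift predicted from the latent code z. Injecting z this way at
    every scale of a U-Net decouples the latent dimension from the
    measurement dimension and adds stochasticity at all scales."""
    mean = x.mean(axis=(1, 2), keepdims=True)        # per-channel mean, shape (C, 1, 1)
    var = x.var(axis=(1, 2), keepdims=True)          # per-channel variance
    x_hat = (x - mean) / np.sqrt(var + eps)          # instance-normalized features
    gamma = (W_gamma @ z + b_gamma)[:, None, None]   # latent-dependent scale, (C, 1, 1)
    beta = (W_beta @ z + b_beta)[:, None, None]      # latent-dependent shift
    return gamma * x_hat + beta

# Toy usage: 4 channels, an 8x8 feature map, a 16-dimensional latent code.
rng = np.random.default_rng(0)
C, H, W, dz = 4, 8, 8, 16
x = rng.standard_normal((C, H, W))
z = rng.standard_normal(dz)
out = conditional_instance_norm(
    x, z,
    W_gamma=rng.standard_normal((C, dz)), b_gamma=np.ones(C),
    W_beta=rng.standard_normal((C, dz)), b_beta=np.zeros(C),
)
print(out.shape)  # (4, 8, 8)
```

Sampling different latent codes z for one fixed measurement then yields different posterior draws, since every such layer modulates the features stochastically.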
Related papers
- Solving High-dimensional Inverse Problems Using Amortized Likelihood-free Inference with Noisy and Incomplete Data [43.43717668587333]
We present a likelihood-free probabilistic inversion method based on normalizing flows for high-dimensional inverse problems.
The proposed method is composed of two complementary networks: a summary network for data compression and an inference network for parameter estimation.
We apply the proposed method to an inversion problem in groundwater hydrology to estimate the posterior distribution of the log-conductivity field conditioned on spatially sparse time-series observations.
arXiv Detail & Related papers (2024-12-05T19:13:17Z)
- Conditional score-based diffusion models for solving inverse problems in mechanics [6.319616423658121]
We propose a framework to perform Bayesian inference using conditional score-based diffusion models.
Conditional score-based diffusion models are generative models that learn to approximate the score function of a conditional distribution.
We demonstrate the efficacy of the proposed approach on a suite of high-dimensional inverse problems in mechanics.
arXiv Detail & Related papers (2024-06-19T02:09:15Z) - Deep Variational Inverse Scattering [18.598311270757527]
Inverse medium scattering solvers generally reconstruct a single solution without an associated measure of uncertainty.
Deep networks such as conditional normalizing flows can be used to sample posteriors in inverse problems.
We propose U-Flow, a Bayesian U-Net based on conditional normalizing flows, which generates high-quality posterior samples and estimates physically-meaningful uncertainty.
arXiv Detail & Related papers (2022-12-08T14:57:06Z) - Semi-supervised Invertible DeepONets for Bayesian Inverse Problems [8.594140167290098]
DeepONets offer a powerful, data-driven tool for solving parametric PDEs by learning operators.
In this work, we employ physics-informed DeepONets in the context of high-dimensional, Bayesian inverse problems.
arXiv Detail & Related papers (2022-09-06T18:55:06Z) - On the Double Descent of Random Features Models Trained with SGD [78.0918823643911]
We study the properties of random features (RF) regression in high dimensions optimized by stochastic gradient descent (SGD).
We derive precise non-asymptotic error bounds for RF regression under both constant and adaptive step-size SGD settings.
We observe the double descent phenomenon both theoretically and empirically.
arXiv Detail & Related papers (2021-10-13T17:47:39Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Efficient Semi-Implicit Variational Inference [65.07058307271329]
We propose an efficient and scalable semi-implicit variational inference (SIVI) method.
Our method yields a rigorous lower bound on the evidence with low-variance gradient estimates.
arXiv Detail & Related papers (2021-01-15T11:39:09Z) - GANs with Variational Entropy Regularizers: Applications in Mitigating
the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learn a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity.
arXiv Detail & Related papers (2020-09-24T19:34:37Z) - On the Convergence Rate of Projected Gradient Descent for a
Back-Projection based Objective [58.33065918353532]
We consider a back-projection (BP) based fidelity term as an alternative to the common least squares (LS) term.
We show that using the BP term, rather than the LS term, requires fewer iterations of optimization algorithms.
arXiv Detail & Related papers (2020-05-03T00:58:23Z) - Parameterizing uncertainty by deep invertible networks, an application
to reservoir characterization [0.9176056742068814]
Uncertainty quantification for full-waveform inversion provides a probabilistic characterization of the ill-conditioning of the problem.
We propose an approach characterized by training a deep network that "pushes forward" Gaussian random inputs into the model space as if they were sampled from the actual posterior distribution.
arXiv Detail & Related papers (2020-04-16T18:37:56Z) - Solving Inverse Problems with a Flow-based Noise Model [100.18560761392692]
We study image inverse problems with a normalizing flow prior.
Our formulation views the solution as the maximum a posteriori estimate of the image conditioned on the measurements.
We empirically validate the efficacy of our method on various inverse problems, including compressed sensing with quantized measurements and denoising with highly structured noise patterns.
arXiv Detail & Related papers (2020-03-18T08:33:49Z)
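The maximum a posteriori formulation in the last entry can be made concrete with a small sketch: the solution maximizes log p(x | y) = log p_noise(y - Ax) + log p_prior(x) + const. Here simple Gaussian log-densities stand in for the learned normalizing flows, and all names below are illustrative assumptions rather than the paper's implementation:

```python
import numpy as np

def map_estimate(A, y, sigma_noise, sigma_prior, steps=2000):
    """Gradient ascent on log p(x | y), with Gaussian densities standing in
    for the flow-based noise model and prior (which makes the log-posterior
    quadratic and the ascent provably convergent)."""
    # Step size from the Lipschitz constant of the quadratic log-posterior.
    lr = 1.0 / (np.linalg.norm(A, 2) ** 2 / sigma_noise ** 2 + 1.0 / sigma_prior ** 2)
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        r = y - A @ x                                                # residual fed to the noise model
        grad = (A.T @ r) / sigma_noise ** 2 - x / sigma_prior ** 2   # gradient of log-posterior in x
        x = x + lr * grad
    return x

# Toy linear measurement y = A x + noise.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 2))
x_true = np.array([1.0, -2.0])
y = A @ x_true + 0.5 * rng.standard_normal(5)
x_map = map_estimate(A, y, sigma_noise=0.5, sigma_prior=1.0)
print(x_map)
```

With flow models in place of the Gaussians, the same gradient ascent goes through because a normalizing flow gives an exact, differentiable log-density.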
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.