VI-DGP: A variational inference method with deep generative prior for
solving high-dimensional inverse problems
- URL: http://arxiv.org/abs/2302.11173v1
- Date: Wed, 22 Feb 2023 06:48:10 GMT
- Title: VI-DGP: A variational inference method with deep generative prior for
solving high-dimensional inverse problems
- Authors: Yingzhi Xia, Qifeng Liao, Jinglai Li
- Abstract summary: We propose a novel approximation method for estimating the high-dimensional posterior distribution.
This approach leverages a deep generative model to learn a prior model capable of generating spatially-varying parameters.
The proposed method can be fully implemented in an automatic differentiation manner.
- Score: 0.7734726150561089
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Solving high-dimensional Bayesian inverse problems (BIPs) with the
variational inference (VI) method is promising but still challenging. The main
difficulties arise from two aspects. First, VI methods approximate the
posterior distribution using a simple and analytic variational distribution,
which makes it difficult to estimate complex spatially-varying parameters in
practice. Second, VI methods typically rely on gradient-based optimization,
which can be computationally expensive or intractable when applied to BIPs
involving partial differential equations (PDEs). To address these challenges,
we propose a novel approximation method for estimating the high-dimensional
posterior distribution. This approach leverages a deep generative model to
learn a prior model capable of generating spatially-varying parameters. This
enables posterior approximation over the latent variable instead of the complex
parameters, thus improving estimation accuracy. Moreover, to accelerate
gradient computation, we employ a differentiable physics-constrained surrogate
model to replace the adjoint method. The proposed method can be fully
implemented in an automatic differentiation manner. Numerical examples
demonstrate two types of log-permeability estimation for flow in heterogeneous
media. The results show the validity, accuracy, and high efficiency of the
proposed method.
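The pipeline the abstract describes — approximate the posterior over a low-dimensional latent variable of a generative prior, with a differentiable surrogate standing in for the adjoint method, optimized end-to-end by automatic differentiation — can be sketched on a toy linear-Gaussian problem. Everything below (the linear generator `A` in place of a deep generative model, the linear surrogate `B` in place of a physics-constrained network, the dimensions and step sizes) is an illustrative assumption, not the paper's actual models; the reparameterized mean-field VI loop is the part that mirrors the described method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: G(z) = A z plays the role of the deep generative prior
# (latent z -> spatially-varying parameters), S(x) = B x plays the role of
# the differentiable surrogate for the PDE forward map.
d_latent, d_param, d_obs = 2, 5, 8
A = rng.standard_normal((d_param, d_latent))
B = rng.standard_normal((d_obs, d_param))
noise_std = 0.5

z_true = np.array([1.0, -0.5])
y = B @ (A @ z_true) + noise_std * rng.standard_normal(d_obs)

M = B @ A  # composed forward map S(G(z))

# Mean-field Gaussian variational posterior q(z) = N(mu, diag(exp(2*log_sig))),
# fitted by stochastic gradient descent on the negative ELBO with the
# reparameterization trick z = mu + sig * eps.
mu = np.zeros(d_latent)
log_sig = np.zeros(d_latent)
lr, n_mc = 0.001, 8

for step in range(6000):
    sig = np.exp(log_sig)
    g_mu = np.zeros(d_latent)
    g_ls = np.zeros(d_latent)
    for _ in range(n_mc):
        eps = rng.standard_normal(d_latent)
        z = mu + sig * eps
        # gradient of the negative log-likelihood wrt z (autodiff would
        # supply this through the surrogate in the full method)
        g_z = M.T @ (M @ z - y) / noise_std**2
        g_mu += g_z
        g_ls += g_z * eps * sig
    # add the analytic gradient of KL(q || N(0, I))
    g_mu = g_mu / n_mc + mu
    g_ls = g_ls / n_mc + sig**2 - 1.0
    mu -= lr * g_mu
    log_sig -= lr * g_ls

# For this linear-Gaussian toy the exact posterior mean is available in
# closed form, so the VI estimate can be checked against it.
mu_exact = np.linalg.solve(M.T @ M / noise_std**2 + np.eye(d_latent),
                           M.T @ y / noise_std**2)
print("VI mean:", np.round(mu, 3), "exact:", np.round(mu_exact, 3))
```

Because the Gaussian family contains the true posterior of this linear toy, the variational mean converges to the exact posterior mean; in the high-dimensional nonlinear setting of the paper, the same loop runs with a trained generator and surrogate and autodiff supplies `g_z`.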
Related papers
- Annealed Stein Variational Gradient Descent for Improved Uncertainty Estimation in Full-Waveform Inversion [25.714206592953545]
Variational Inference (VI) provides an approximate solution to the posterior distribution in the form of a parametric or non-parametric proposal distribution.
This study aims to improve the performance of VI within the context of Full-Waveform Inversion.
arXiv Detail & Related papers (2024-10-17T06:15:26Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Nonparametric Automatic Differentiation Variational Inference with Spline Approximation [7.5620760132717795]
We develop a nonparametric approximation approach that enables flexible posterior approximation for distributions with complicated structures.
Compared with widely used nonparametric inference methods, the proposed method is easy to implement and adaptive to various data structures.
Experiments demonstrate the efficiency of the proposed method in approximating complex posterior distributions and improving the performance of generative models with incomplete data.
arXiv Detail & Related papers (2024-03-10T20:22:06Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Variational Gaussian Process Diffusion Processes [17.716059928867345]
Diffusion processes are a class of stochastic differential equations (SDEs) providing a rich family of expressive models.
Probabilistic inference and learning under generative models with latent processes endowed with a non-linear diffusion process prior are intractable problems.
We build upon work within variational inference, approximating the posterior process as a linear diffusion process, and point out pathologies in the approach.
arXiv Detail & Related papers (2023-06-03T09:43:59Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ) in describing systems.
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.