VI-DGP: A variational inference method with deep generative prior for
solving high-dimensional inverse problems
- URL: http://arxiv.org/abs/2302.11173v1
- Date: Wed, 22 Feb 2023 06:48:10 GMT
- Authors: Yingzhi Xia, Qifeng Liao, Jinglai Li
- Abstract summary: We propose a novel approximation method for estimating the high-dimensional posterior distribution.
This approach leverages a deep generative model to learn a prior model capable of generating spatially-varying parameters.
The proposed method can be fully implemented in an automatic differentiation manner.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Solving high-dimensional Bayesian inverse problems (BIPs) with the
variational inference (VI) method is promising but still challenging. The main
difficulties arise from two aspects. First, VI methods approximate the
posterior distribution using a simple and analytic variational distribution,
which makes it difficult to estimate complex spatially-varying parameters in
practice. Second, VI methods typically rely on gradient-based optimization,
which can be computationally expensive or intractable when applied to BIPs
involving partial differential equations (PDEs). To address these challenges,
we propose a novel approximation method for estimating the high-dimensional
posterior distribution. This approach leverages a deep generative model to
learn a prior model capable of generating spatially-varying parameters. This
enables posterior approximation over the latent variable instead of the complex
parameters, thus improving estimation accuracy. Moreover, to accelerate
gradient computation, we employ a differentiable physics-constrained surrogate
model to replace the adjoint method. The proposed method can be fully
implemented in an automatic differentiation manner. Numerical examples
demonstrate two types of log-permeability estimation for flow in heterogeneous
media. The results show the validity, accuracy, and high efficiency of the
proposed method.
Related papers
- ODE-DPS: ODE-based Diffusion Posterior Sampling for Inverse Problems in Partial Differential Equation [1.8356973269166506]
We introduce a novel unsupervised inversion methodology tailored for solving inverse problems arising from PDEs.
Our approach operates within the Bayesian inversion framework, treating the task of sampling from the posterior distribution as a conditional generation process.
To enhance the accuracy of inversion results, we propose an ODE-based Diffusion inversion algorithm.
arXiv Detail & Related papers (2024-04-21T00:57:13Z)
- Nonparametric Automatic Differentiation Variational Inference with Spline Approximation [7.5620760132717795]
We develop a nonparametric approximation approach that enables flexible posterior approximation for distributions with complicated structures.
Compared with widely-used nonparametric inference methods, the proposed method is easy to implement and adapts to various data structures.
Experiments demonstrate the efficiency of the proposed method in approximating complex posterior distributions and improving the performance of generative models with incomplete data.
arXiv Detail & Related papers (2024-03-10T20:22:06Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
We propose to improve these methods by using a more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Variational Gaussian Process Diffusion Processes [17.716059928867345]
Diffusion processes are a class of stochastic differential equations (SDEs) providing a rich family of expressive models.
Probabilistic inference and learning under generative models with latent processes endowed with a non-linear diffusion process prior are intractable problems.
We build upon work within variational inference, approximating the posterior process as a linear diffusion process, and point out pathologies in the approach.
arXiv Detail & Related papers (2023-06-03T09:43:59Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm, manifold Gaussian variational Bayes on the precision matrix (MGVBP), whose updates satisfy the positive-definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ) in complex systems.
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z)
- Variational Nonlinear System Identification [0.8793721044482611]
This paper considers parameter estimation for nonlinear state-space models, which is an important but challenging problem.
We employ a variational inference (VI) approach, which is a principled method that has deep connections to maximum likelihood estimation.
This VI approach ultimately provides estimates of the model as solutions to an optimisation problem, which is deterministic, tractable and can be solved using standard optimisation tools.
arXiv Detail & Related papers (2020-12-08T05:43:50Z)
- Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to problems in which one variable is subject to a sparsity constraint.
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.