A domain-decomposed VAE method for Bayesian inverse problems
- URL: http://arxiv.org/abs/2301.05708v1
- Date: Mon, 9 Jan 2023 07:35:43 GMT
- Title: A domain-decomposed VAE method for Bayesian inverse problems
- Authors: Xu Zhihang, Xia Yingzhi, Liao Qifeng
- Abstract summary: This paper proposes a domain-decomposed variational auto-encoder Markov chain Monte Carlo (DD-VAE-MCMC) method to tackle these challenges simultaneously.
The proposed method first constructs local deterministic generative models based on local historical data, which provide efficient local prior representations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian inverse problems are often computationally challenging when the
forward model is governed by complex partial differential equations (PDEs).
This is typically caused by expensive forward model evaluations and
high-dimensional parameterization of priors. This paper proposes a
domain-decomposed variational auto-encoder Markov chain Monte Carlo
(DD-VAE-MCMC) method to tackle these challenges simultaneously. Through
partitioning the global physical domain into small subdomains, the proposed
method first constructs local deterministic generative models based on local
historical data, which provide efficient local prior representations. Gaussian
process models with active learning address the domain decomposition interface
conditions. Inversions are then conducted on each subdomain independently, in
parallel, and in low-dimensional latent parameter spaces. The local inference
solutions are post-processed through a Poisson image blending procedure to
yield an efficient global inference result. Numerical examples are provided to
demonstrate the performance of the proposed method.
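To make the pipeline concrete, here is a minimal sketch of the latent-space sampling step: a trained VAE decoder maps a low-dimensional latent vector (with a standard normal prior) to a parameter field, and a preconditioned Crank-Nicolson (pCN) Metropolis sampler explores the latent posterior. The `decoder` and `forward_model` below are hypothetical stand-ins for the paper's local generative models and PDE solver, and the abstract does not specify a particular proposal; pCN is chosen here only because it pairs naturally with a standard normal latent prior.

```python
# Sketch of latent-space MCMC with a generative prior, in the spirit of
# DD-VAE-MCMC. decoder() and forward_model() are hypothetical stand-ins,
# not the paper's actual networks or PDE solver.
import numpy as np

rng = np.random.default_rng(0)

def decoder(z):
    """Hypothetical trained generative model: latent z -> parameter field."""
    W = np.linspace(0.1, 1.0, 8 * z.size).reshape(8, z.size)  # fixed "weights"
    return np.tanh(W @ z)

def forward_model(theta):
    """Hypothetical cheap surrogate for the PDE forward map."""
    return np.cumsum(theta)[::2]

def neg_log_likelihood(z, y_obs, noise_std):
    r = forward_model(decoder(z)) - y_obs
    return 0.5 * np.sum(r**2) / noise_std**2

def pcn_mcmc(y_obs, dim=4, n_steps=5000, beta=0.3, noise_std=0.05):
    """Preconditioned Crank-Nicolson sampler; exploits the N(0, I) latent prior."""
    z = rng.standard_normal(dim)
    phi = neg_log_likelihood(z, y_obs, noise_std)
    samples = []
    for _ in range(n_steps):
        # pCN proposal keeps the standard-normal prior invariant,
        # so the accept/reject step depends on the data misfit only.
        z_prop = np.sqrt(1.0 - beta**2) * z + beta * rng.standard_normal(dim)
        phi_prop = neg_log_likelihood(z_prop, y_obs, noise_std)
        if np.log(rng.uniform()) < phi - phi_prop:
            z, phi = z_prop, phi_prop
        samples.append(z.copy())
    return np.array(samples)

# Synthetic data from a known latent vector, then inversion in latent space.
z_true = rng.standard_normal(4)
y_obs = forward_model(decoder(z_true)) + 0.05 * rng.standard_normal(4)
posterior_z = pcn_mcmc(y_obs)
print("posterior latent mean:", posterior_z[1000:].mean(axis=0))
```

The abstract's other ingredient, Gaussian process models with active learning for the interface conditions, can be sketched in the same spirit: fit a GP to sampled interface values and greedily query the location of largest predictive variance. The kernel, sizes, and target function below are illustrative assumptions, not the paper's settings.

```python
# Sketch of GP regression on a 1-D interface with variance-driven active
# learning. All modeling choices here are illustrative assumptions.
import numpy as np

def rbf(a, b, length=0.2):
    """Squared-exponential kernel with unit variance, so k(x, x) = 1."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)                 # (n_train, n_test)
    alpha = np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, Ks)
    mean = Ks.T @ alpha
    var = np.maximum(1.0 - np.sum(Ks * v, axis=0), 0.0)
    return mean, var

def active_learning(target, n_init=3, n_add=5):
    """Greedily query the interface location with the largest predictive variance."""
    x = np.linspace(0.0, 1.0, n_init)
    y = target(x)
    candidates = np.linspace(0.0, 1.0, 101)
    for _ in range(n_add):
        _, var = gp_posterior(x, y, candidates)
        x_new = candidates[np.argmax(var)]
        x, y = np.append(x, x_new), np.append(y, target(x_new))
    return x, y

xs, ys = active_learning(lambda s: np.sin(2 * np.pi * s))
print("queried interface points:", np.round(np.sort(xs), 3))
```

In the actual method these pieces run per subdomain in parallel, and the local solutions are reassembled into a global field via Poisson image blending, which is omitted here.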
Related papers
- Boundless Across Domains: A New Paradigm of Adaptive Feature and Cross-Attention for Domain Generalization in Medical Image Segmentation [1.93061220186624]
Domain-invariant representation learning is a powerful method for domain generalization.
Previous approaches face challenges such as high computational demands, training instability, and limited effectiveness with high-dimensional data.
We propose an Adaptive Feature Blending (AFB) method that generates out-of-distribution samples while exploring the in-distribution space.
arXiv Detail & Related papers (2024-11-22T12:06:24Z) - Non-overlapping, Schwarz-type Domain Decomposition Method for Physics and Equality Constrained Artificial Neural Networks [0.24578723416255746]
We present a non-overlapping, Schwarz-type domain decomposition method with a generalized interface condition.
Our approach employs physics and equality-constrained artificial neural networks (PECANN) within each subdomain.
A distinct advantage of our domain decomposition method is its ability to learn solutions to both Poisson's equation and the Helmholtz equation.
arXiv Detail & Related papers (2024-09-20T16:48:55Z) - Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z) - VI-DGP: A variational inference method with deep generative prior for
solving high-dimensional inverse problems [0.7734726150561089]
We propose a novel approximation method for estimating the high-dimensional posterior distribution.
This approach leverages a deep generative model to learn a prior model capable of generating spatially-varying parameters.
The proposed method can be fully implemented in an automatic differentiation manner.
arXiv Detail & Related papers (2023-02-22T06:48:10Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Inverse Models for Estimating the Initial Condition of Spatio-Temporal
Advection-Diffusion Processes [5.814371485767541]
Inverse problems involve making inference about unknown parameters of a physical process using observational data.
This paper investigates the estimation of the initial condition of a spatio-temporal advection-diffusion process using spatially sparse data streams.
arXiv Detail & Related papers (2023-02-08T15:30:16Z) - Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z) - Super-model ecosystem: A domain-adaptation perspective [101.76769818069072]
This paper attempts to establish the theoretical foundation for the emerging super-model paradigm via domain adaptation.
Super-model paradigms help reduce computational and data costs and carbon emissions, which is critical to the AI industry.
arXiv Detail & Related papers (2022-08-30T09:09:43Z) - Model-Based Domain Generalization [96.84818110323518]
We propose a novel approach for the domain generalization problem called Model-Based Domain Generalization.
Our algorithms beat the current state-of-the-art methods on the very-recently-proposed WILDS benchmark by up to 20 percentage points.
arXiv Detail & Related papers (2021-02-23T00:59:02Z) - Distributed Variational Bayesian Algorithms Over Sensor Networks [6.572330981878818]
We propose two novel distributed VB algorithms for general Bayesian inference problems.
The proposed algorithms perform almost as well as the corresponding centralized VB algorithm, which relies on all data being available at a fusion center.
arXiv Detail & Related papers (2020-11-27T08:12:18Z) - Model Fusion with Kullback--Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean-field assumption for both the fused model and the individual dataset posteriors; a minimal Gaussian instance of this fusion idea is sketched after this list.
arXiv Detail & Related papers (2020-07-13T03:27:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.