Gaussian processes for Bayesian inverse problems associated with linear
partial differential equations
- URL: http://arxiv.org/abs/2307.08343v1
- Date: Mon, 17 Jul 2023 09:31:26 GMT
- Title: Gaussian processes for Bayesian inverse problems associated with linear
partial differential equations
- Authors: Tianming Bai, Aretha L. Teckentrup, Konstantinos C. Zygalakis
- Abstract summary: This work is concerned with the use of Gaussian surrogate models for inverse problems associated with linear partial differential equations.
The type of Gaussian prior used is of critical importance with respect to how well the surrogate model will perform in terms of Bayesian inversion.
A number of different experiments illustrate the superiority of the PDE-informed Gaussian priors over more traditional priors.
- Score: 0.8379286663107844
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work is concerned with the use of Gaussian surrogate models for Bayesian
inverse problems associated with linear partial differential equations. A
particular focus is on the regime where only a small amount of training data is
available. In this regime the type of Gaussian prior used is of critical
importance with respect to how well the surrogate model will perform in terms
of Bayesian inversion. We extend the framework of Raissi et. al. (2017) to
construct PDE-informed Gaussian priors that we then use to construct different
approximate posteriors. A number of different numerical experiments illustrate
the superiority of the PDE-informed Gaussian priors over more traditional
priors.
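The key idea behind a PDE-informed prior is that a linear operator applied to a Gaussian process is again a Gaussian process, so placing a GP prior on the solution u induces closed-form cross-covariances with the PDE right-hand side f. The sketch below illustrates this for the 1D Poisson problem -u'' = f with a squared-exponential kernel; it is a minimal illustration of the general construction, not the authors' implementation, and all function names and parameter values are chosen here for exposition.

```python
import numpy as np

def rbf(r, l):
    """Squared-exponential kernel as a function of r = x - x'."""
    return np.exp(-r**2 / (2 * l**2))

def k_uu(x, xp, l):
    """Prior covariance of the solution: cov(u(x), u(x'))."""
    return rbf(x - xp, l)

def k_uf(x, xp, l):
    """cov(u(x), f(x')) for f = -u'': apply -d^2/dx'^2 to the kernel."""
    r = x - xp
    return (1.0 / l**2 - r**2 / l**4) * rbf(r, l)

def k_ff(x, xp, l):
    """cov(f(x), f(x')): the operator applied in both arguments
    gives the fourth derivative of the kernel in r."""
    r = x - xp
    return (3.0 / l**4 - 6.0 * r**2 / l**6 + r**4 / l**8) * rbf(r, l)

def posterior_u(x_star, x_f, y_f, l=0.3, noise=1e-4):
    """GP posterior mean/covariance of u at x_star, conditioned on
    noisy observations y_f of the right-hand side f at x_f."""
    Xf, Xs = x_f[:, None], x_star[:, None]
    Kff = k_ff(Xf, Xf.T, l) + noise * np.eye(len(x_f))
    Ksf = k_uf(Xs, Xf.T, l)
    Kss = k_uu(Xs, Xs.T, l)
    alpha = np.linalg.solve(Kff, y_f)
    mean = Ksf @ alpha
    cov = Kss - Ksf @ np.linalg.solve(Kff, Ksf.T)
    return mean, cov
```

In an inverse-problem setting, this conditioned GP serves as a cheap surrogate for the forward PDE solve inside the Bayesian posterior; the same derivative-of-kernel construction extends to other linear operators and higher dimensions.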
Related papers
- Error Analysis of Bayesian Inverse Problems with Generative Priors [9.276062058338443]
We present quantitative error bounds for minimum Wasserstein-2 generative models used as priors. We show that, under some assumptions, the error in the posterior due to the generative prior inherits the same rate as the prior error with respect to the Wasserstein-1 distance.
arXiv Detail & Related papers (2026-01-24T08:45:27Z) - Blade: A Derivative-free Bayesian Inversion Method using Diffusion Priors [27.491109854890492]
We introduce Blade, which can produce accurate and well-calibrated posteriors for Bayesian inversion using an ensemble of interacting particles. Blade achieves superior performance compared to existing derivative-free Bayesian inversion methods on various inverse problems.
arXiv Detail & Related papers (2025-10-13T03:19:44Z) - MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction [72.70572835589158]
We propose constructing a mixed Gaussian prior for a normalizing flow model for trajectory prediction.
Our method achieves state-of-the-art performance in the evaluation of both trajectory alignment and diversity on the popular UCY/ETH and SDD datasets.
arXiv Detail & Related papers (2024-02-19T15:48:55Z) - Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z) - GibbsDDRM: A Partially Collapsed Gibbs Sampler for Solving Blind Inverse
Problems with Denoising Diffusion Restoration [64.8770356696056]
We propose GibbsDDRM, an extension of Denoising Diffusion Restoration Models (DDRM) to a blind setting in which the linear measurement operator is unknown.
The proposed method is problem-agnostic, meaning that a pre-trained diffusion model can be applied to various inverse problems without fine-tuning.
arXiv Detail & Related papers (2023-01-30T06:27:48Z) - Gaussian Processes and Statistical Decision-making in Non-Euclidean
Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z) - The Schrödinger Bridge between Gaussian Measures has a Closed Form [101.79851806388699]
We focus on the dynamic formulation of OT, also known as the Schrödinger bridge (SB) problem.
In this paper, we provide closed-form expressions for SBs between Gaussian measures.
arXiv Detail & Related papers (2022-02-11T15:59:01Z) - A Variational Inference Approach to Inverse Problems with Gamma
Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z) - Bayesian inference of ODEs with Gaussian processes [17.138448665454373]
We propose a novel Bayesian nonparametric model that uses Gaussian processes to infer posteriors of unknown ODE systems directly from data.
We derive sparse variational inference with decoupled functional sampling to represent vector field posteriors.
The method demonstrates the benefit of computing vector field posteriors, with predictive uncertainty scores outperforming alternative methods on multiple ODE learning tasks.
arXiv Detail & Related papers (2021-06-21T08:09:17Z) - Connecting Hamilton--Jacobi partial differential equations with maximum
a posteriori and posterior mean estimators for some non-convex priors [0.0]
In this chapter, we consider a certain class of non-log-concave regularizations and show that similar representation formulas for the minimizer can also be obtained.
We also present similar results for certain Bayesian posterior mean estimators with Gaussian data fidelity and certain non-log-concave priors using an analogue of min-plus algebra techniques.
arXiv Detail & Related papers (2021-04-22T19:00:37Z) - Mixtures of Gaussian Processes for regression under multiple prior
distributions [0.0]
We extend the idea of Mixture models for Gaussian Process regression in order to work with multiple prior beliefs at once.
We consider the usage of our approach to additionally account for the problem of prior misspecification in functional regression problems.
arXiv Detail & Related papers (2021-04-19T10:19:14Z) - Improving predictions of Bayesian neural nets via local linearization [79.21517734364093]
We argue that the Gauss-Newton approximation should be understood as a local linearization of the underlying Bayesian neural network (BNN).
Because we use this linearized model for posterior inference, we should also predict using this modified model instead of the original one.
We refer to this modified predictive as "GLM predictive" and show that it effectively resolves common underfitting problems of the Laplace approximation.
arXiv Detail & Related papers (2020-08-19T12:35:55Z) - Variational Autoencoders with Riemannian Brownian Motion Priors [4.8461049669050915]
Variational Autoencoders (VAEs) represent the given data in a low-dimensional latent space, which is generally assumed to be Euclidean.
Recent work has shown that this prior has a detrimental effect on model capacity, leading to subpar performance.
We propose an efficient inference scheme that does not rely on the unknown normalizing factor of this prior.
arXiv Detail & Related papers (2020-02-12T20:35:21Z) - Linearly Constrained Gaussian Processes with Boundary Conditions [5.33024001730262]
We consider prior knowledge from systems of linear partial differential equations together with their boundary conditions.
We construct multi-output Gaussian process priors with realizations in the solution set of such systems.
arXiv Detail & Related papers (2020-02-03T15:19:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.