Flow Annealed Kalman Inversion for Gradient-Free Inference in Bayesian
Inverse Problems
- URL: http://arxiv.org/abs/2309.11490v1
- Date: Wed, 20 Sep 2023 17:39:14 GMT
- Title: Flow Annealed Kalman Inversion for Gradient-Free Inference in Bayesian
Inverse Problems
- Authors: Richard D.P. Grumitt, Minas Karamanis and Uroš Seljak
- Abstract summary: Flow Annealed Kalman Inversion (FAKI) is a generalization of Ensemble Kalman Inversion (EKI).
We demonstrate the performance of FAKI on two numerical benchmarks.
- Score: 1.534667887016089
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: For many scientific inverse problems we are required to evaluate an expensive
forward model. Moreover, the model is often given in such a form that it is
unrealistic to access its gradients. In such a scenario, standard Markov Chain
Monte Carlo algorithms quickly become impractical, requiring a large number of
serial model evaluations to converge on the target distribution. In this paper
we introduce Flow Annealed Kalman Inversion (FAKI). This is a generalization of
Ensemble Kalman Inversion (EKI), where we embed the Kalman filter updates in a
temperature annealing scheme, and use normalizing flows (NF) to map the
intermediate measures corresponding to each temperature level to the standard
Gaussian. In doing so, we relax the Gaussian ansatz for the intermediate
measures used in standard EKI, allowing us to achieve higher fidelity
approximations to non-Gaussian targets. We demonstrate the performance of FAKI
on two numerical benchmarks, showing dramatic improvements over standard EKI in
terms of accuracy whilst accelerating its already rapid convergence properties
(typically in $\mathcal{O}(10)$ steps).
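Below is a minimal NumPy sketch of the annealed, flow-transformed Kalman update described in the abstract. It is an illustration under stated assumptions, not the paper's implementation: `fit_flow` is a hypothetical helper standing in for training a normalizing flow on the current ensemble (returning forward and inverse maps), and the fixed linear temperature schedule replaces whatever adaptive scheme the paper actually uses.

```python
import numpy as np

def faki_sketch(G, y, Gamma, prior_ensemble, fit_flow, n_levels=10, rng=None):
    """Illustrative sketch of a FAKI-style update.

    G              : forward model, maps parameters u -> predicted data G(u)
    y              : observed data vector
    Gamma          : observation noise covariance
    prior_ensemble : (J, d) array of prior samples
    fit_flow       : hypothetical helper; fits a normalizing flow to an
                     ensemble and returns (to_latent, from_latent) callables
    """
    rng = np.random.default_rng() if rng is None else rng
    u = prior_ensemble.copy()
    J = u.shape[0]
    d_beta = 1.0 / n_levels  # fixed schedule for illustration only

    for _ in range(n_levels):
        # Fit a normalizing flow to the current ensemble and move the
        # particles to latent space, where they are approximately N(0, I)
        # and the Gaussian ansatz of the Kalman update is better satisfied.
        to_latent, from_latent = fit_flow(u)
        z = np.stack([to_latent(u_j) for u_j in u])

        # Forward model is still evaluated in parameter space.
        g = np.stack([G(from_latent(z_j)) for z_j in z])

        # Ensemble Kalman update in latent space, with the noise covariance
        # inflated by 1 / d_beta to account for the tempered likelihood.
        z_mean, g_mean = z.mean(0), g.mean(0)
        C_zg = (z - z_mean).T @ (g - g_mean) / (J - 1)
        C_gg = (g - g_mean).T @ (g - g_mean) / (J - 1)
        K = C_zg @ np.linalg.inv(C_gg + Gamma / d_beta)

        # Perturbed observations consistent with the tempered noise level.
        xi = rng.multivariate_normal(np.zeros(len(y)), Gamma / d_beta, size=J)
        z = z + (y + xi - g) @ K.T

        # Map the updated particles back to parameter space.
        u = np.stack([from_latent(z_j) for z_j in z])

    return u
```

With a flow that reduces to a pure whitening transform, the update above collapses back to standard annealed EKI; the gain claimed in the paper comes from the flow absorbing non-Gaussian structure in the intermediate measures before the linear Kalman step is applied.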
Related papers
- Sequential Kalman Monte Carlo for gradient-free inference in Bayesian inverse problems [1.3654846342364308]
We introduce Sequential Kalman Monte Carlo samplers to perform gradient-free inference in inverse problems.
FAKI employs normalizing flows to relax the Gaussian ansatz of the target measures in EKI.
FAKI alone is not able to correct for the model linearity assumptions in EKI.
arXiv Detail & Related papers (2024-07-10T15:56:30Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Variational Kalman Filtering with Hinf-Based Correction for Robust
Bayesian Learning in High Dimensions [2.294014185517203]
We address the problem of convergence of the sequential variational inference filter (VIF) through the application of a robust variational objective and an Hinf-norm based correction.
A novel VIF-Hinf recursion that employs consecutive variational inference and Hinf-based optimization steps is proposed.
arXiv Detail & Related papers (2022-04-27T17:38:13Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - Gaussian MRF Covariance Modeling for Efficient Black-Box Adversarial
Attacks [86.88061841975482]
We study the problem of generating adversarial examples in a black-box setting, where we only have access to a zeroth order oracle.
We use this setting to find fast one-step adversarial attacks, akin to a black-box version of the Fast Gradient Sign Method (FGSM).
We show that the method uses fewer queries and achieves higher attack success rates than the current state of the art.
arXiv Detail & Related papers (2020-10-08T18:36:51Z) - An adaptive Hessian approximated stochastic gradient MCMC method [12.93317525451798]
We present an adaptive Hessian approximated gradient MCMC method to incorporate local geometric information while sampling from the posterior.
We adopt a magnitude-based weight pruning method to enforce the sparsity of the network.
arXiv Detail & Related papers (2020-10-03T16:22:15Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)