$floZ$: Improved Bayesian evidence estimation from posterior samples with normalizing flows
- URL: http://arxiv.org/abs/2404.12294v2
- Date: Fri, 7 Jun 2024 09:16:23 GMT
- Title: $floZ$: Improved Bayesian evidence estimation from posterior samples with normalizing flows
- Authors: Rahul Srinivasan, Marco Crisostomi, Roberto Trotta, Enrico Barausse, Matteo Breschi
- Abstract summary: We introduce $floZ$, an improved method for estimating the Bayesian evidence from a set of samples drawn from the unnormalized posterior distribution.
We validate it on distributions whose evidence is known analytically, up to 15 parameter space dimensions, and compare with two state-of-the-art techniques.
$floZ$ has wide applicability, e.g., to estimate the evidence from variational inference, Markov Chain Monte Carlo samples, or any other method that delivers samples from the unnormalized posterior density.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce $floZ$, an improved method based on normalizing flows, for estimating the Bayesian evidence (and its numerical uncertainty) from a set of samples drawn from the unnormalized posterior distribution. We validate it on distributions whose evidence is known analytically, up to 15 parameter space dimensions, and compare with two state-of-the-art techniques for estimating the evidence: nested sampling (which computes the evidence as its main target) and a $k$-nearest-neighbors technique that produces evidence estimates from posterior samples. Provided representative samples from the target posterior are available, our method is more robust to posterior distributions with sharp features, especially in higher dimensions. For a simple multivariate Gaussian, we demonstrate its accuracy for up to 200 dimensions with $10^5$ posterior samples. $floZ$ has wide applicability, e.g., to estimate the evidence from variational inference, Markov Chain Monte Carlo samples, or any other method that delivers samples from the unnormalized posterior density, such as simulation-based inference. We apply $floZ$ to compute the Bayes factor for the presence of the first overtone in the ringdown signal of the gravitational wave data of GW150914, finding good agreement with nested sampling.
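The central idea admits a compact illustration: if a normalized density $q(\theta)$ fitted to the posterior samples approximates the posterior $p(\theta|d)$, then $\log Z \approx \log \tilde{p}(\theta) - \log q(\theta)$ holds (nearly) constantly across samples, where $\tilde{p}$ is the unnormalized posterior. The sketch below is not the authors' code: it substitutes a simple Gaussian fit for the trained normalizing flow (which only suffices on this Gaussian toy problem), and all names and parameter values in it are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Minimal sketch (not the floZ implementation): fit a normalized density
# q to posterior samples, then estimate log Z = log p_tilde - log q.
# A Gaussian fit stands in for the trained normalizing flow.

rng = np.random.default_rng(0)
dim = 5

# Toy unnormalized posterior with a known normalization, so the
# estimate can be checked against the truth.
true_log_Z = -2.0
cov = np.eye(dim)
def log_p_tilde(theta):
    return true_log_Z + multivariate_normal(np.zeros(dim), cov).logpdf(theta)

# "Posterior samples" (in practice from MCMC, VI, SBI, ...).
samples = rng.multivariate_normal(np.zeros(dim), cov, size=20_000)

# Fit the stand-in density estimator q to the samples.
mu_hat = samples.mean(axis=0)
cov_hat = np.cov(samples, rowvar=False)
log_q = multivariate_normal(mu_hat, cov_hat).logpdf(samples)

# Evidence estimate: log Z = log p_tilde - log q, sample by sample.
log_Z_per_sample = log_p_tilde(samples) - log_q
log_Z_hat = np.median(log_Z_per_sample)   # median is robust to tail samples
scatter = log_Z_per_sample.std()          # rough diagnostic of fit quality

print(f"log Z estimate: {log_Z_hat:.3f} (true {true_log_Z}), scatter {scatter:.3f}")
```

In $floZ$ itself, $q$ is a normalizing flow trained with additional loss terms that penalize the scatter of $\log \tilde{p} - \log q$ across samples, which is what makes the estimator robust to sharp posterior features in higher dimensions; the Gaussian stand-in above sidesteps that training entirely.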
Related papers
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on manifolds.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Consistency Model is an Effective Posterior Sample Approximation for Diffusion Inverse Solvers [28.678613691787096]
Previous approximations rely on the posterior means, which may not lie in the support of the image distribution.
We introduce a novel approach to posterior approximation that is guaranteed to generate valid samples within the support of the image distribution.
arXiv Detail & Related papers (2024-02-09T02:23:47Z)
- Posterior samples of source galaxies in strong gravitational lenses with score-based priors [107.52670032376555]
We use a score-based model to encode the prior for the inference of undistorted images of background galaxies.
We show that the balance between the likelihood and the prior meets our expectations in an experiment with out-of-distribution data.
arXiv Detail & Related papers (2022-11-07T19:00:42Z)
- DensePure: Understanding Diffusion Models towards Adversarial Robustness [110.84015494617528]
We analyze the properties of diffusion models and establish the conditions under which they can enhance certified robustness.
We propose a new method, DensePure, designed to improve the certified robustness of a pretrained model (i.e., a classifier).
We show that this robust region is a union of multiple convex sets, and is potentially much larger than the robust regions identified in previous works.
arXiv Detail & Related papers (2022-11-01T08:18:07Z)
- Adversarial Bayesian Simulation [0.9137554315375922]
We bridge approximate Bayesian computation (ABC) with deep neural implicit samplers based on generative adversarial networks (GANs) and adversarial variational Bayes.
We develop a Bayesian GAN that directly targets the posterior by solving an adversarial optimization problem.
We show that the typical total variation distance between the true and approximate posteriors converges to zero for certain neural network generators and discriminators.
arXiv Detail & Related papers (2022-08-25T14:18:39Z)
- Convergence for score-based generative modeling with polynomial complexity [9.953088581242845]
We prove the first convergence guarantees for the core mechanic behind score-based generative modeling.
Compared to previous works, we do not incur error that grows exponentially in time or that suffers from a curse of dimensionality.
We show that a predictor-corrector gives better convergence than using either portion alone.
arXiv Detail & Related papers (2022-06-13T14:57:35Z)
- Accelerated Bayesian SED Modeling using Amortized Neural Posterior Estimation [0.0]
We present an alternative, scalable approach to rigorous Bayesian inference using Amortized Neural Posterior Estimation (ANPE).
ANPE is a simulation-based inference method that employs neural networks to estimate the posterior probability distribution.
We present, and publicly release, ${\rm SEDflow}$, an ANPE method that produces posteriors of the recent Hahn et al. (2022) SED model from optical photometry.
arXiv Detail & Related papers (2022-03-14T18:00:03Z)
- Sensing Cox Processes via Posterior Sampling and Positive Bases [56.82162768921196]
We study adaptive sensing of point processes, a widely used model from spatial statistics.
We model the intensity function as a sample from a truncated Gaussian process, represented in a specially constructed positive basis.
Our adaptive sensing algorithms use Langevin dynamics and are based on posterior sampling (Cox-Thompson) and top-two posterior sampling (Top2) principles. (A minimal Langevin posterior-sampling sketch appears after this list.)
arXiv Detail & Related papers (2021-10-21T14:47:06Z)
- Instance-Optimal Compressed Sensing via Posterior Sampling [101.43899352984774]
We show, for Gaussian measurements and *any* prior distribution on the signal, that the posterior sampling estimator achieves near-optimal recovery guarantees.
We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
arXiv Detail & Related papers (2021-06-21T22:51:56Z)
- The Generalized Lasso with Nonlinear Observations and Generative Priors [63.541900026673055]
We make the assumption of sub-Gaussian measurements, which is satisfied by a wide range of measurement models.
We show that our result can be extended to a uniform recovery guarantee under the assumption of a so-called local embedding property.
arXiv Detail & Related papers (2020-06-22T16:43:35Z)
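Several entries above (the Cox-process and compressed-sensing papers, and the corrector step of predictor-corrector samplers) lean on Langevin dynamics for posterior sampling. Below is a generic, minimal sketch of the unadjusted Langevin algorithm (ULA) on a linear-Gaussian toy inverse problem, not a reconstruction of any listed paper's method; the standard-normal prior and all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal ULA sketch for posterior sampling in a linear inverse problem
# y = A x + noise. Real applications in the papers above replace the
# Gaussian prior gradient with a learned score or deep generative prior.

rng = np.random.default_rng(1)
n, m = 20, 10                      # signal dim, number of measurements
sigma = 0.1                        # measurement noise std

A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = rng.normal(size=n)
y = A @ x_true + sigma * rng.normal(size=m)

def grad_log_posterior(x):
    # log p(x|y) = -||y - Ax||^2 / (2 sigma^2) - ||x||^2 / 2 + const
    grad_lik = A.T @ (y - A @ x) / sigma**2
    grad_prior = -x                # standard-normal prior stand-in
    return grad_lik + grad_prior

# ULA update: x_{k+1} = x_k + (eps/2) grad log p(x_k|y) + sqrt(eps) xi
eps, n_steps, burn = 1e-4, 50_000, 10_000
x = np.zeros(n)
samples = []
for k in range(n_steps):
    x = x + 0.5 * eps * grad_log_posterior(x) + np.sqrt(eps) * rng.normal(size=n)
    if k >= burn:
        samples.append(x.copy())

# Rough sanity check: the posterior mean should land near x_true
# (only approximately, since m < n leaves directions unconstrained).
post_mean = np.mean(samples, axis=0)
print("posterior-mean error:", np.linalg.norm(post_mean - x_true))
```

The same update is the "corrector" portion of predictor-corrector samplers for score-based generative models; the step size must be small relative to the sharpest curvature of the log posterior for the unadjusted chain to stay stable.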
This list is automatically generated from the titles and abstracts of the papers on this site.