Bayesian evidence estimation from posterior samples with normalizing flows
- URL: http://arxiv.org/abs/2404.12294v3
- Date: Thu, 05 Dec 2024 15:27:14 GMT
- Title: Bayesian evidence estimation from posterior samples with normalizing flows
- Authors: Rahul Srinivasan, Marco Crisostomi, Roberto Trotta, Enrico Barausse, Matteo Breschi
- Abstract summary: We propose a novel method to estimate the Bayesian evidence (and its numerical uncertainty) from a set of samples drawn from the unnormalized posterior distribution.
We validate it on distributions whose evidence is known analytically, up to 15 parameter space dimensions, and compare with two state-of-the-art techniques.
$floZ$ has wide applicability, e.g., to estimate evidence from variational inference, Markov Chain Monte Carlo samples, or any other method that delivers samples and their likelihood from the unnormalized posterior density.
- Abstract: We propose a novel method ($floZ$), based on normalizing flows, to estimate the Bayesian evidence (and its numerical uncertainty) from a pre-existing set of samples drawn from the unnormalized posterior distribution. We validate it on distributions whose evidence is known analytically, up to 15 parameter space dimensions, and compare with two state-of-the-art techniques for estimating the evidence: nested sampling (which computes the evidence as its main target) and a $k$-nearest-neighbors technique that produces evidence estimates from posterior samples. Provided representative samples from the target posterior are available, our method is more robust to posterior distributions with sharp features, especially in higher dimensions. For a simple multivariate Gaussian, we demonstrate its accuracy for up to 200 dimensions with $10^5$ posterior samples. $floZ$ has wide applicability, e.g., to estimate evidence from variational inference, Markov Chain Monte Carlo samples, or any other method that delivers samples and their likelihood from the unnormalized posterior density. As a physical application, we use $floZ$ to compute the Bayes factor for the presence of the first overtone in the ringdown signal of the gravitational wave data of GW150914, finding good agreement with nested sampling.
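A minimal sketch of the identity behind this kind of estimator (not the authors' $floZ$ implementation): since the normalized posterior is $p(\theta|d) = \tilde p(\theta)/Z$, fitting a density estimate $q(\theta)$ to the posterior samples gives $\log Z \approx \log \tilde p(\theta_i) - \log q(\theta_i)$ at every sample $\theta_i$. In the sketch below, a SciPy Gaussian KDE stands in for the normalizing flow purely for illustration, on a toy problem whose evidence is known analytically; all names and numbers are assumptions for the example.

```python
# Toy illustration (assumed, not from the paper): estimate log Z from posterior
# samples via log Z ~ log p_tilde(theta) - log q(theta), where q is a density
# estimate of the normalized posterior fitted to the samples. A Gaussian KDE
# stands in here for the normalizing flow used by floZ.
import numpy as np
from scipy.stats import gaussian_kde, multivariate_normal

rng = np.random.default_rng(0)
dim = 3

# Unit-Gaussian likelihood times a flat prior on a box of half-width 10,
# so the analytic evidence is Z = (2 * 10)**(-dim) up to tiny truncation effects.
half_width = 10.0
log_prior = -dim * np.log(2.0 * half_width)
like = multivariate_normal(mean=np.zeros(dim))  # identity covariance

def log_unnormalized_posterior(theta):
    """log p_tilde(theta) = log L(theta) + log pi(theta)."""
    return like.logpdf(theta) + log_prior

# Stand-in for posterior samples from MCMC or variational inference:
# here the posterior is essentially the unit Gaussian, so draw it directly.
samples = rng.standard_normal((5_000, dim))

# Density estimate of the *normalized* posterior (flow surrogate).
q = gaussian_kde(samples.T)      # gaussian_kde expects shape (dim, n_samples)
log_q = q.logpdf(samples.T)

# Per-sample evidence estimates and a robust aggregate.
log_z_per_sample = log_unnormalized_posterior(samples) - log_q
log_z = np.median(log_z_per_sample)
lo, hi = np.percentile(log_z_per_sample, [16, 84])  # rough scatter, not a rigorous error bar

print(f"estimated log Z = {log_z:.3f}  (16-84%: [{lo:.3f}, {hi:.3f}])")
print(f"analytic  log Z = {log_prior:.3f}")
```

The KDE's smoothing bias grows quickly with dimension and with sharp posterior features, which is exactly why the paper trains a normalizing flow on the samples instead; the median is just one robust way to aggregate the per-sample estimates, and the quoted scatter is only a rough stand-in for the numerical uncertainty the paper reports.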
Related papers
- Generative Modeling with Bayesian Sample Inference [50.07758840675341]
We derive a novel generative model from the simple act of Gaussian posterior inference.
Treating the generated sample as an unknown variable to infer lets us formulate the sampling process in the language of Bayesian probability.
Our model uses a sequence of prediction and posterior update steps to narrow down the unknown sample from a broad initial belief.
arXiv Detail & Related papers (2025-02-11T14:27:10Z)
- Gaussian credible intervals in Bayesian nonparametric estimation of the unseen [7.54430260415628]
The unseen-species problem assumes $n \geq 1$ samples from a population of individuals belonging to different species, possibly infinitely many.
We propose a novel methodology to derive large-$m$ credible intervals for $K_{n,m}$, for any $n \geq 1$.
arXiv Detail & Related papers (2025-01-27T12:48:05Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on a manifold.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Active Diffusion Subsampling [15.028061496012924]
In maximum-entropy sampling, one selects measurement locations that are expected to have the highest entropy, so as to minimize uncertainty about $x$.
Recently, diffusion models have been shown to produce high-quality posterior samples of high-dimensional signals using guided diffusion.
We propose Active Diffusion Subsampling (ADS), a method for performing active subsampling using guided diffusion.
arXiv Detail & Related papers (2024-06-20T15:05:06Z)
- Consistency Model is an Effective Posterior Sample Approximation for Diffusion Inverse Solvers [28.678613691787096]
Previous approximations rely on the posterior means, which may not lie in the support of the image distribution.
We introduce a novel approach for posterior approximation that guarantees to generate valid samples within the support of the image distribution.
arXiv Detail & Related papers (2024-02-09T02:23:47Z)
- Posterior samples of source galaxies in strong gravitational lenses with score-based priors [107.52670032376555]
We use a score-based model to encode the prior for the inference of undistorted images of background galaxies.
We show how the balance between the likelihood and the prior meets our expectations in an experiment with out-of-distribution data.
arXiv Detail & Related papers (2022-11-07T19:00:42Z)
- DensePure: Understanding Diffusion Models towards Adversarial Robustness [110.84015494617528]
We analyze the properties of diffusion models and establish the conditions under which they can enhance certified robustness.
We propose a new method, DensePure, designed to improve the certified robustness of a pretrained model (i.e., a classifier).
We show that this robust region is a union of multiple convex sets, and is potentially much larger than the robust regions identified in previous works.
arXiv Detail & Related papers (2022-11-01T08:18:07Z)
- Accelerated Bayesian SED Modeling using Amortized Neural Posterior Estimation [0.0]
We present an alternative, scalable approach to rigorous Bayesian inference using Amortized Neural Posterior Estimation (ANPE).
ANPE is a simulation-based inference method that employs neural networks to estimate the posterior probability distribution.
We present, and publicly release, SEDflow, an ANPE method to produce posteriors of the recent Hahn et al. (2022) SED model from optical photometry.
arXiv Detail & Related papers (2022-03-14T18:00:03Z)
- Instance-Optimal Compressed Sensing via Posterior Sampling [101.43899352984774]
We show, for Gaussian measurements and any prior distribution on the signal, that the posterior sampling estimator achieves near-optimal recovery guarantees.
We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
arXiv Detail & Related papers (2021-06-21T22:51:56Z)
- Tracking disease outbreaks from sparse data with Bayesian inference [55.82986443159948]
The COVID-19 pandemic provides new motivation for estimating the empirical rate of transmission during an outbreak.
Standard methods struggle to accommodate the partial observability and sparse data common at finer scales.
We propose a Bayesian framework which accommodates partial observability in a principled manner.
arXiv Detail & Related papers (2020-09-12T20:37:33Z)