Marginal Post Processing of Bayesian Inference Products with Normalizing
Flows and Kernel Density Estimators
- URL: http://arxiv.org/abs/2205.12841v5
- Date: Mon, 18 Dec 2023 10:47:24 GMT
- Title: Marginal Post Processing of Bayesian Inference Products with Normalizing
Flows and Kernel Density Estimators
- Authors: Harry T. J. Bevins, William J. Handley, Pablo Lemos, Peter H. Sims,
Eloy de Lera Acedo, Anastasia Fialkov, Justin Alsing
- Abstract summary: We use Masked Autoregressive Flows and Kernel Density Estimators to learn marginal posterior densities corresponding to core science parameters.
We find that the marginal or 'nuisance-free' posteriors and the associated likelihoods have an abundance of applications.
- Score: 0.4397520291340696
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian analysis has become an indispensable tool across many different
cosmological fields including the study of gravitational waves, the Cosmic
Microwave Background and the 21-cm signal from the Cosmic Dawn among other
phenomena. The method provides a way to fit complex models to data describing
key cosmological and astrophysical signals and a whole host of contaminating
signals and instrumental effects modelled with 'nuisance parameters'. In this
paper, we summarise a method that uses Masked Autoregressive Flows and Kernel
Density Estimators to learn marginal posterior densities corresponding to core
science parameters. We find that the marginal or 'nuisance-free' posteriors and
the associated likelihoods have an abundance of applications, including the
calculation of previously intractable marginal Kullback-Leibler divergences and
marginal Bayesian Model Dimensionalities, likelihood emulation and prior
emulation. We demonstrate each application using toy examples, examples from
the field of 21-cm cosmology and samples from the Dark Energy Survey. We
discuss how marginal summary statistics like the Kullback-Leibler divergences
and Bayesian Model Dimensionalities can be used to examine the constraining
power of different experiments and how we can perform efficient joint analysis
by taking advantage of marginal prior and likelihood emulators. We package our
multipurpose code as the pip-installable package margarine for use by the wider
scientific community.
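The core workflow sketched in the abstract can be illustrated end to end: fit a density estimator to the marginal (nuisance-free) posterior samples, fit another to the prior samples, and then estimate the marginal Kullback-Leibler divergence and Bayesian Model Dimensionality as the mean and (twice the) variance of the log density ratio. The sketch below is a minimal stand-in, not margarine's actual API: it uses scipy's Gaussian KDE in place of a Masked Autoregressive Flow, and the correlated toy samples are purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Toy joint posterior over (theta, nu): theta is a 'core science' parameter,
# nu a correlated nuisance parameter (illustrative stand-in for real chains).
theta = rng.normal(0.0, 1.0, 5000)
nu = 0.6 * theta + rng.normal(0.0, 1.0, 5000)

# Marginal 'nuisance-free' posterior: keep only the theta samples and fit a
# density estimator to them (a Gaussian KDE here, not margarine's MAF/KDE).
marginal_post = gaussian_kde(theta)

# Marginal prior density, fitted the same way from prior samples.
prior_samples = rng.normal(0.0, 3.0, 5000)
marginal_prior = gaussian_kde(prior_samples)

# Monte Carlo estimate of the marginal KL divergence D(posterior || prior):
# the average log density ratio evaluated at the posterior samples.
log_ratio = np.log(marginal_post(theta)) - np.log(marginal_prior(theta))
kl = log_ratio.mean()

# Marginal Bayesian Model Dimensionality: d/2 equals the variance
# of the log density ratio, so d = 2 * var.
bmd = 2.0 * log_ratio.var()
print(f"marginal KL ~ {kl:.3f}, marginal BMD ~ {bmd:.3f}")
```

With a 1-parameter Gaussian toy problem, the BMD estimate comes out close to one effective constrained dimension, which is the sanity check one would expect.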
Related papers
- A Likelihood Based Approach to Distribution Regression Using Conditional Deep Generative Models [6.647819824559201]
We study the large-sample properties of a likelihood-based approach for estimating conditional deep generative models.
Our results lead to the convergence rate of a sieve maximum likelihood estimator for estimating the conditional distribution.
arXiv Detail & Related papers (2024-10-02T20:46:21Z)
- A comparison of Bayesian sampling algorithms for high-dimensional particle physics and cosmology applications [0.0]
We review and compare a wide range of Markov Chain Monte Carlo (MCMC) and nested sampling techniques.
We show that several examples widely thought to be most easily solved using nested sampling approaches can in fact be more efficiently solved using modern MCMC algorithms.
arXiv Detail & Related papers (2024-09-27T05:57:48Z)
- $\mathtt{emuflow}$: Normalising Flows for Joint Cosmological Analysis [0.0]
We show that normalising flows can be used to efficiently combine cosmological constraints from independent datasets.
We show that the method is able to accurately describe the posterior distribution of real cosmological datasets.
The resulting joint constraints can be obtained in a fraction of the time it would take to combine the same datasets at the level of their likelihoods.
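The joint-analysis idea above can be sketched for a single shared parameter: emulate each experiment's marginal posterior with a density estimator, then multiply the emulated densities to obtain the joint constraint without re-running either likelihood. This is a toy sketch under stated assumptions (independent datasets, a flat prior so nothing is double counted, and hypothetical Gaussian chains), with a Gaussian KDE standing in for a normalising flow.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Hypothetical marginal chains for one shared parameter from two experiments.
samples_a = rng.normal(0.2, 0.5, 4000)   # experiment A
samples_b = rng.normal(-0.1, 0.4, 4000)  # experiment B

# Emulate each marginal posterior with a density estimator
# (a normalising flow would play this role in practice).
dens_a, dens_b = gaussian_kde(samples_a), gaussian_kde(samples_b)

# Joint constraint on a grid: multiply the emulated densities.
# Valid for independent datasets under a flat prior (no double counting).
grid = np.linspace(-3.0, 3.0, 1000)
dx = grid[1] - grid[0]
joint = dens_a(grid) * dens_b(grid)
joint /= joint.sum() * dx                # normalise to a density

mean = (grid * joint).sum() * dx
print(f"joint posterior mean ~ {mean:.3f}")
```

The joint mean lands between the two single-experiment means, weighted toward the tighter constraint, as a precision-weighted combination should.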
arXiv Detail & Related papers (2024-09-02T18:04:14Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides in a manifold, and the covariable, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z)
- Probabilistic Mass Mapping with Neural Score Estimation [4.079848600120986]
We introduce a novel methodology for efficient sampling of the high-dimensional Bayesian posterior of the weak lensing mass-mapping problem.
We aim to demonstrate the accuracy of the method on simulations, and then proceed to applying it to the mass reconstruction of the HST/ACS COSMOS field.
arXiv Detail & Related papers (2022-01-14T17:07:48Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Towards constraining warm dark matter with stellar streams through neural simulation-based inference [7.608718235345664]
We introduce a likelihood-free Bayesian inference pipeline based on Amortised Approximate Likelihood Ratios (AALR).
We apply the method to the simplified case where stellar streams are only perturbed by dark matter subhaloes.
arXiv Detail & Related papers (2020-11-30T15:53:43Z)
- Generalized Sliced Distances for Probability Distributions [47.543990188697734]
We introduce a broad family of probability metrics, coined Generalized Sliced Probability Metrics (GSPMs).
GSPMs are rooted in the generalized Radon transform and come with a unique geometric interpretation.
We consider GSPM-based gradient flows for generative modeling applications and show that under mild assumptions, the gradient flow converges to the global optimum.
arXiv Detail & Related papers (2020-02-28T04:18:00Z)
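The sliced-metric idea in the last entry can be illustrated in its simplest form: project both sample sets onto random directions, compute the cheap one-dimensional Wasserstein distance via sorted samples on each slice, and average. This is a plain linear-projection sketch (the generalised Radon transform of the paper would replace the linear projections with nonlinear ones), and the Gaussian sample sets are hypothetical.

```python
import numpy as np

def sliced_wasserstein(x, y, n_proj=200, seed=0):
    """Monte Carlo sliced 1-Wasserstein distance between two equal-size
    sample sets x, y of shape (n, d), using linear random projections."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)      # random unit direction (slice)
        px, py = np.sort(x @ theta), np.sort(y @ theta)
        total += np.mean(np.abs(px - py))   # 1-D W1 from sorted samples
    return total / n_proj

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=(2000, 3))
y = rng.normal(0.5, 1.0, size=(2000, 3))   # mean-shifted copy
d_xy = sliced_wasserstein(x, y)
print(f"sliced W1 ~ {d_xy:.3f}")
```

The distance from a sample set to itself is exactly zero (identical sorted projections), which makes a convenient correctness check.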
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.