Nested sampling with any prior you like
- URL: http://arxiv.org/abs/2102.12478v1
- Date: Wed, 24 Feb 2021 18:45:13 GMT
- Title: Nested sampling with any prior you like
- Authors: Justin Alsing and Will Handley
- Abstract summary: Bijectors trained on samples from a desired prior density provide a general-purpose method for constructing transformations from the uniform base density to a target prior.
We demonstrate the use of trained bijectors in conjunction with nested sampling on a number of examples from cosmology.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nested sampling is an important tool for conducting Bayesian analysis in
Astronomy and other fields, both for sampling complicated posterior
distributions for parameter inference, and for computing marginal likelihoods
for model comparison. One technical obstacle to using nested sampling in
practice is the requirement that prior distributions be provided in the form of
bijective transformations from the unit hyper-cube to the target prior density.
For many applications - particularly when using the posterior from one
experiment as the prior for another - such a transformation is not readily
available. In this letter we show that parametric bijectors trained on samples
from a desired prior density provide a general-purpose method for constructing
transformations from the uniform base density to a target prior, enabling the
practical use of nested sampling under arbitrary priors. We demonstrate the use
of trained bijectors in conjunction with nested sampling on a number of
examples from cosmology.
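As a concrete illustration of the idea (not the paper's implementation, which trains parametric normalizing-flow bijectors), the sketch below uses a per-dimension empirical quantile transform learned from prior samples as the map from the unit hypercube to the target prior; the resulting `prior_transform` is exactly the object a nested sampler such as dynesty consumes. All data and numbers are illustrative.

```python
import numpy as np

# Samples from the desired prior, e.g. a posterior chain from a previous
# experiment (here: a toy correlated 2D Gaussian, purely illustrative).
rng = np.random.default_rng(0)
prior_samples = rng.multivariate_normal(
    mean=[0.0, 0.0], cov=[[1.0, 0.8], [0.8, 1.0]], size=10_000
)

# Stand-in for a trained parametric bijector: an empirical quantile
# transform applied dimension-by-dimension. It maps the uniform base
# density on [0, 1]^d to the prior's marginals.
sorted_samples = np.sort(prior_samples, axis=0)
n = sorted_samples.shape[0]
grid = (np.arange(n) + 0.5) / n

def prior_transform(u):
    """Map a point u in the unit hypercube to the target prior."""
    return np.array([
        np.interp(u[d], grid, sorted_samples[:, d])
        for d in range(len(u))
    ])

# A nested sampler only ever asks for this map, e.g. with dynesty:
#   sampler = dynesty.NestedSampler(loglike, prior_transform, ndim=2)
print(prior_transform(np.array([0.5, 0.975])))  # ~ [0.0, 1.96]
```

A quantile transform captures only the marginals; the trained bijectors of the paper model the joint density, so correlations between dimensions are preserved, which is what makes the method general-purpose.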
Related papers
- Low-rank Bayesian matrix completion via geodesic Hamiltonian Monte Carlo on Stiefel manifolds [0.18416014644193066]
We present a new sampling-based approach that enables efficient computation of low-rank Bayesian matrix completion.
We show that our approach resolves the sampling difficulties encountered by standard Gibbs samplers for the common two-matrix factorization used in matrix completion.
Numerical examples demonstrate superior sampling performance, including better mixing and faster convergence to a stationary distribution.
arXiv Detail & Related papers (2024-10-27T03:12:53Z)
- Unified Convergence Analysis for Score-Based Diffusion Models with Deterministic Samplers [49.1574468325115]
We introduce a unified convergence analysis framework for deterministic samplers.
Our framework achieves iteration complexity of $\tilde{O}(d^2/\epsilon)$.
We also provide a detailed analysis of Denoising Diffusion Implicit Models (DDIM)-type samplers.
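For readers unfamiliar with DDIM-type deterministic samplers, a minimal sampling loop (the eta = 0 case of DDIM) might look like the hedged sketch below; `eps_model` is a placeholder for a trained noise-prediction network and the schedule is illustrative, not the paper's setup.

```python
import numpy as np

def ddim_sample(eps_model, alpha_bar, x_T):
    """Deterministic DDIM-type sampling loop (eta = 0).

    eps_model(x, t) is a placeholder noise-prediction network;
    alpha_bar[t] is the cumulative product of (1 - beta_t).
    """
    x = x_T
    for t in range(len(alpha_bar) - 1, 0, -1):
        eps = eps_model(x, t)
        # Predict the clean sample, then step deterministically to t - 1.
        x0 = (x - np.sqrt(1.0 - alpha_bar[t]) * eps) / np.sqrt(alpha_bar[t])
        x = np.sqrt(alpha_bar[t - 1]) * x0 + np.sqrt(1.0 - alpha_bar[t - 1]) * eps
    return x

# Toy run with a dummy network that predicts zero noise.
alpha_bar = np.cumprod(1.0 - np.linspace(1e-4, 0.02, 50))
x = ddim_sample(lambda x, t: np.zeros_like(x), alpha_bar, np.random.randn(4))
```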
arXiv Detail & Related papers (2024-10-18T07:37:36Z)
- Online Posterior Sampling with a Diffusion Prior [20.24212000441531]
Posterior sampling in contextual bandits with a Gaussian prior can be implemented exactly or approximately using the Laplace approximation.
In this work, we propose approximate posterior sampling algorithms for contextual bandits with a diffusion model prior.
arXiv Detail & Related papers (2024-10-04T20:47:16Z)
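The exact Gaussian-prior case mentioned in the entry above reduces to Thompson sampling from a conjugate Bayesian linear-regression posterior. A toy sketch of that baseline (all parameters illustrative; this is not the paper's diffusion-prior algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
d, sigma2 = 3, 0.25                      # feature dim, reward noise variance
Sigma0 = np.eye(d)                       # Gaussian prior N(0, Sigma0)
A = np.linalg.inv(Sigma0)                # posterior precision
b = np.zeros(d)

def posterior_sample():
    """Exact posterior draw: theta ~ N(A^{-1} b, A^{-1})."""
    cov = np.linalg.inv(A)
    return rng.multivariate_normal(cov @ b, cov)

for t in range(100):
    arms = rng.standard_normal((5, d))   # candidate contexts (toy)
    theta = posterior_sample()           # Thompson sampling step
    x = arms[np.argmax(arms @ theta)]    # act greedily w.r.t. the sample
    r = x @ np.array([1.0, -0.5, 0.2]) + np.sqrt(sigma2) * rng.standard_normal()
    A += np.outer(x, x) / sigma2         # conjugate Gaussian update
    b += r * x / sigma2
```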
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Plug-and-Play split Gibbs sampler: embedding deep generative priors in Bayesian inference [12.91637880428221]
This paper introduces a plug-and-play sampling algorithm that leverages variable splitting to efficiently sample from a posterior distribution.
It divides the challenging task of posterior sampling into two simpler sampling problems.
Its performance is compared to recent state-of-the-art optimization and sampling methods.
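The variable-splitting idea can be made concrete in a fully conjugate toy model: augment the posterior with an auxiliary variable z coupled to x through a Gaussian of width rho, then alternate the two simpler conditional draws. In the plug-and-play method the prior step is handled by a deep generative model; the standard-Gaussian prior below is an illustrative stand-in that keeps every step exact.

```python
import numpy as np

rng = np.random.default_rng(2)
y = np.array([1.2, -0.7, 0.4])      # observed data (toy)
sigma2, rho2 = 0.5, 0.1             # noise variance, splitting parameter

x = np.zeros_like(y)
z = np.zeros_like(y)
samples = []
for it in range(5000):
    # Step 1: sample x | z  -- involves only the likelihood term.
    prec_x = 1.0 / sigma2 + 1.0 / rho2
    mean_x = (y / sigma2 + z / rho2) / prec_x
    x = mean_x + rng.standard_normal(y.shape) / np.sqrt(prec_x)

    # Step 2: sample z | x  -- involves only the prior term. In the
    # plug-and-play variant this step would call a deep generative
    # prior / denoiser; a standard Gaussian prior keeps it exact here.
    prec_z = 1.0 + 1.0 / rho2
    mean_z = (x / rho2) / prec_z
    z = mean_z + rng.standard_normal(y.shape) / np.sqrt(prec_z)
    samples.append(x.copy())

print(np.mean(samples[1000:], axis=0))  # ~ posterior mean of x
```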
arXiv Detail & Related papers (2023-04-21T17:17:51Z)
- Adaptive Sketches for Robust Regression with Importance Sampling [64.75899469557272]
We introduce data structures for solving robust regression through stochastic gradient descent (SGD).
Our algorithm effectively runs $T$ steps of SGD with importance sampling while using sublinear space and just making a single pass over the data.
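Generic importance-sampled SGD (not the paper's sketching data structure, which maintains the sampling distribution in sublinear space) looks like the sketch below: draw example i with probability p_i and reweight its gradient by 1/(n p_i) to keep the estimate unbiased. The row-norm proposal is an assumed surrogate for gradient magnitude.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((1000, 5))
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(1000)

# Importance sampling proposal: probabilities proportional to row norms.
p = np.linalg.norm(X, axis=1)
p /= p.sum()

w = np.zeros(5)
lr = 0.01
for t in range(2000):
    i = rng.choice(len(X), p=p)
    # Unbiased gradient estimate: reweight by 1 / (n * p_i).
    g = (X[i] @ w - y[i]) * X[i] / (len(X) * p[i])
    w -= lr * g
print(w)
```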
arXiv Detail & Related papers (2022-07-16T03:09:30Z)
- Sensing Cox Processes via Posterior Sampling and Positive Bases [56.82162768921196]
We study adaptive sensing of point processes, a widely used model from spatial statistics.
We model the intensity function as a sample from a truncated Gaussian process, represented in a specially constructed positive basis.
Our adaptive sensing algorithms use Langevin dynamics and are based on posterior sampling (Cox-Thompson) and top-two posterior sampling (Top2) principles.
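The Langevin-dynamics ingredient is the standard unadjusted Langevin update, x_{k+1} = x_k + eta * grad log pi(x_k) + sqrt(2 eta) * xi_k. A minimal sketch on a toy Gaussian stand-in for the intensity posterior (target and step size assumed for illustration):

```python
import numpy as np

def grad_log_post(x):
    """Gradient of a toy standard-Gaussian log-posterior."""
    return -x

rng = np.random.default_rng(4)
x = np.zeros(2)
eta = 0.05                     # step size
samples = []
for k in range(10_000):
    # Unadjusted Langevin step: drift up the log-posterior, plus noise.
    x = x + eta * grad_log_post(x) + np.sqrt(2.0 * eta) * rng.standard_normal(2)
    samples.append(x.copy())
print(np.cov(np.array(samples[2000:]).T))  # ~ identity for this target
```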
arXiv Detail & Related papers (2021-10-21T14:47:06Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
By contrast, a pathwise interpretation of conditioning treats a posterior sample as a prior sample updated by a data-dependent term; we show how this gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
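That pathwise update is Matheron's rule: (f* | y) equals f* + K(*,X) (K(X,X) + sigma^2 I)^{-1} (y - f(X) - eps), where f is a single joint prior draw and eps is simulated observation noise. A minimal numpy sketch with a toy squared-exponential kernel and data (all assumed for illustration):

```python
import numpy as np

def k(a, b, ls=0.5):
    """Squared-exponential kernel matrix for 1D inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

rng = np.random.default_rng(5)
X = np.array([-1.0, 0.0, 1.0])          # training inputs
y = np.array([0.5, -0.2, 0.3])          # observations
Xs = np.linspace(-2, 2, 100)            # test inputs
noise = 1e-2

# Draw one joint prior sample over train and test locations.
Z = np.concatenate([X, Xs])
f = rng.multivariate_normal(np.zeros(len(Z)), k(Z, Z) + 1e-6 * np.eye(len(Z)))
fX, fXs = f[: len(X)], f[len(X):]

# Matheron's rule: update the prior sample pathwise with the residual.
eps = np.sqrt(noise) * rng.standard_normal(len(X))
K = k(X, X) + noise * np.eye(len(X))
posterior_sample = fXs + k(Xs, X) @ np.linalg.solve(K, y - fX - eps)
```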
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)