Reconstructing the Universe with Variational self-Boosted Sampling
- URL: http://arxiv.org/abs/2206.15433v1
- Date: Tue, 28 Jun 2022 21:30:32 GMT
- Title: Reconstructing the Universe with Variational self-Boosted Sampling
- Authors: Chirag Modi, Yin Li, David Blei
- Abstract summary: Traditional algorithms such as Hamiltonian Monte Carlo (HMC) are computationally inefficient because they generate correlated samples.
Here we develop a hybrid scheme called variational self-boosted sampling (VBS) to mitigate the drawbacks of both algorithms.
VBS generates higher-quality samples than simple VI approaches and reduces the correlation length in the sampling phase by a factor of 10-50 relative to using HMC alone.
- Score: 7.922637707393503
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Forward modeling approaches in cosmology have made it possible to reconstruct
the initial conditions at the beginning of the Universe from observed survey
data. However, the high dimensionality of the parameter space still poses a
challenge to exploring the full posterior: traditional algorithms such as
Hamiltonian Monte Carlo (HMC) are computationally inefficient because they
generate correlated samples, while the performance of variational inference
depends strongly on the choice of divergence (loss) function. Here we develop
a hybrid scheme, called variational self-boosted sampling (VBS), that
mitigates the drawbacks of both algorithms by learning a variational
approximation for the proposal distribution of Monte Carlo sampling and
combining it with HMC. The variational distribution is parameterized as a
normalizing flow and learnt from samples generated on the fly, while proposals
drawn from it reduce the auto-correlation length of the MCMC chains. Our
normalizing flow uses Fourier-space convolutions and element-wise operations
to scale to high dimensions. We show that, after a short initial warm-up and
training phase, VBS generates higher-quality samples than simple VI approaches
and reduces the correlation length in the sampling phase by a factor of 10-50
over using only HMC to explore the posterior of the initial conditions in
$64^3$- and $128^3$-dimensional problems, with larger gains for observations
with high signal-to-noise.
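To make the two-phase scheme concrete, here is a minimal, self-contained sketch: a short warm-up chain supplies training samples, a flow acting diagonally in Fourier space (a single convolution layer with a learnable transfer function t(|k|); the paper stacks such layers with element-wise nonlinearities) is fitted to them, and the flow then serves as an independence proposal in Metropolis-Hastings. Everything is illustrative, not the authors' implementation: a random-walk kernel stands in for HMC, a linear Gaussian toy model stands in for the gravitational forward model, and all names and settings are invented for the sketch.

```python
# VBS-flavoured toy: fit a Fourier-diagonal flow to warm-up MCMC samples,
# then use it as an independence proposal in Metropolis-Hastings.
import numpy as np

rng = np.random.default_rng(0)
N = 16                                        # paper uses 64^3 and 128^3 grids
shape = (N, N, N)
kf = np.fft.fftfreq(N)
kx, ky, kz = np.meshgrid(kf, kf, kf, indexing="ij")
kmag = np.sqrt(kx**2 + ky**2 + kz**2)

# Toy target: Gaussian field prior with power-law spectrum + noisy observation.
Pk = 1.0 / (1e-3 + kmag) ** 2
x_true = np.fft.ifftn(np.sqrt(Pk) * np.fft.fftn(rng.standard_normal(shape))).real
sigma_n = 0.5
data = x_true + sigma_n * rng.standard_normal(shape)

def log_post(x):
    """Unnormalized log posterior of the toy linear model."""
    xk = np.fft.fftn(x)
    return (-0.5 * np.sum(np.abs(xk) ** 2 / Pk) / x.size
            - 0.5 * np.sum((data - x) ** 2) / sigma_n**2)

# Flow x = mu + F^-1[t(k) F[z]], z ~ N(0, I), with t(k) > 0 constant in |k|
# bins; the Jacobian log-determinant of this linear map is sum_k log t(k).
nbins = 8
edges = np.linspace(0.0, kmag.max() + 1e-9, nbins + 1)
which = np.digitize(kmag.ravel(), edges) - 1

def fit_flow(samples):
    """Moment matching (= MLE for this Gaussian flow) on chain samples."""
    mu = np.mean(samples, axis=0)
    power = np.mean([np.abs(np.fft.fftn(s - mu)) ** 2 / mu.size for s in samples], axis=0)
    t2 = np.array([power.ravel()[which == b].mean() for b in range(nbins)])
    return mu, np.sqrt(t2)[which].reshape(shape)

def flow_sample(mu, t):
    return mu + np.fft.ifftn(t * np.fft.fftn(rng.standard_normal(shape))).real

def flow_logq(x, mu, t):
    # Log proposal density up to a constant (constants cancel in the MH ratio).
    z = np.fft.ifftn(np.fft.fftn(x - mu) / t).real
    return -0.5 * np.sum(z**2) - np.sum(np.log(t))

def rwm_step(x, lp, step=0.01):
    """Random-walk Metropolis stand-in for the paper's HMC warm-up kernel."""
    prop = x + step * rng.standard_normal(shape)
    lp_prop = log_post(prop)
    return (prop, lp_prop) if np.log(rng.uniform()) < lp_prop - lp else (x, lp)

# Phase 1: short warm-up chain generates training samples on the fly.
x, lp = data.copy(), log_post(data)
warmup = []
for i in range(400):
    x, lp = rwm_step(x, lp)
    if i % 20 == 0:
        warmup.append(x.copy())
mu, t = fit_flow(warmup)

# Phase 2: independence MH with the flow proposal; accepted states are nearly
# uncorrelated, which is where the reduced correlation length comes from.
accepted = 0
for i in range(200):
    prop = flow_sample(mu, t)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < (lp_prop + flow_logq(x, mu, t)
                                - lp - flow_logq(prop, mu, t)):
        x, lp, accepted = prop, lp_prop, accepted + 1
print(f"flow-proposal acceptance rate: {accepted / 200:.2f}")
```

In the paper the two phases are interleaved (the flow keeps training on new samples) and flow proposals are mixed with HMC steps; the sketch separates them only for brevity.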
Related papers
- Variational Learning of Gaussian Process Latent Variable Models through Stochastic Gradient Annealed Importance Sampling [22.256068524699472]
In this work, we propose an Annealed Importance Sampling (AIS) approach to address the limitations of standard variational inference for these models.
We combine the strengths of Sequential Monte Carlo samplers and VI to explore a wider range of posterior distributions and gradually approach the target distribution.
Experimental results on both toy and image datasets demonstrate that our method outperforms state-of-the-art methods in terms of tighter variational bounds, higher log-likelihoods, and more robust convergence.
arXiv Detail & Related papers (2024-08-13T08:09:05Z)
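For context, the AIS recipe this summary refers to fits in a few lines: anneal from a tractable base to the target along a geometric path, reweight, and apply an MCMC transition at each temperature. The sketch below is the generic algorithm with invented toy densities and random-walk moves, not the paper's GPLVM-specific estimator or its stochastic-gradient-tuned kernels.

```python
# Generic annealed importance sampling (AIS): geometric path from a Gaussian
# base to an unnormalized target, one Metropolis step per temperature.
# Returns an estimate of the target's log normalizing constant.
import numpy as np

rng = np.random.default_rng(1)
d, n_particles, n_temps = 2, 1000, 60
s0 = 2.0                                           # base standard deviation

def log_base(x):                                   # normalized N(0, s0^2 I)
    return -0.5 * np.sum(x**2, axis=-1) / s0**2 - 0.5 * d * np.log(2 * np.pi * s0**2)

def log_target(x):                                 # unnormalized N(3, 0.5^2 I)
    return -0.5 * np.sum((x - 3.0) ** 2, axis=-1) / 0.5**2

betas = np.linspace(0.0, 1.0, n_temps + 1)
x = s0 * rng.standard_normal((n_particles, d))
logw = np.zeros(n_particles)
for k in range(1, n_temps + 1):
    # Importance-weight increment for moving from beta_{k-1} to beta_k.
    logw += (betas[k] - betas[k - 1]) * (log_target(x) - log_base(x))
    # One MH step targeting the intermediate density at beta_k.
    prop = x + 0.3 * rng.standard_normal(x.shape)
    lp = (1 - betas[k]) * log_base(x) + betas[k] * log_target(x)
    lp_prop = (1 - betas[k]) * log_base(prop) + betas[k] * log_target(prop)
    acc = np.log(rng.uniform(size=n_particles)) < lp_prop - lp
    x[acc] = prop[acc]

m = logw.max()
log_Z = m + np.log(np.mean(np.exp(logw - m)))      # estimate of log Z_target
print(f"AIS log Z = {log_Z:.3f}  (exact: {0.5 * d * np.log(2 * np.pi * 0.25):.3f})")
```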
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Faster Sampling without Isoperimetry via Diffusion-based Monte Carlo [30.4930148381328]
Diffusion-based Monte Carlo (DMC) is a method to sample from a general target distribution beyond the isoperimetric condition.
DMC suffers from high gradient complexity, resulting in an exponential dependency on the error tolerance $\epsilon$ of the obtained samples.
We propose RS-DMC, based on a novel recursion-based score estimation method.
Our algorithm is provably much faster than the popular Langevin-based algorithms.
arXiv Detail & Related papers (2024-01-12T02:33:57Z)
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
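The sequential Monte Carlo backbone that (online) VSMC adapts is a particle filter. Below is a plain bootstrap filter on a toy linear-Gaussian state-space model, with a comment marking where VSMC would substitute its learned variational proposal and update parameters online; the model and all settings are invented for illustration.

```python
# Bootstrap particle filter on a 1-D linear-Gaussian state-space model:
#   x_t = a x_{t-1} + q e_t,   y_t = x_t + r v_t.
import numpy as np

rng = np.random.default_rng(2)
T, n = 100, 512
a, q, r = 0.9, 0.5, 1.0

# Simulate a trajectory and observations from the model.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + q * rng.standard_normal()
y = x_true + r * rng.standard_normal(T)

particles = rng.standard_normal(n)
w = np.full(n, 1.0 / n)
loglik = 0.0
for t in range(T):
    if t > 0:
        idx = rng.choice(n, size=n, p=w)           # multinomial resampling
        # Bootstrap proposal = model transition. VSMC instead samples from a
        # learned q_phi(x_t | x_{t-1}, y_t) and, in the online variant, takes
        # a stochastic-gradient step on phi after each sweep (omitted here).
        particles = a * particles[idx] + q * rng.standard_normal(n)
    logw = -0.5 * ((y[t] - particles) / r) ** 2 - 0.5 * np.log(2 * np.pi * r**2)
    m = logw.max()
    loglik += m + np.log(np.mean(np.exp(logw - m)))  # p(y_t | y_{1:t-1}) estimate
    w = np.exp(logw - m)
    w /= w.sum()

print(f"particle-filter log-likelihood estimate: {loglik:.2f}")
```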
- Inverse Models for Estimating the Initial Condition of Spatio-Temporal Advection-Diffusion Processes [5.814371485767541]
Inverse problems involve inferring unknown parameters of a physical process from observational data.
This paper investigates the estimation of the initial condition of a spatio-temporal advection-diffusion process using spatially sparse data streams.
arXiv Detail & Related papers (2023-02-08T15:30:16Z)
- Gaussian process regression and conditional Karhunen-Loève models for data assimilation in inverse problems [68.8204255655161]
We present a model inversion algorithm, CKLEMAP, for data assimilation and parameter estimation in partial differential equation models.
The CKLEMAP method provides better scalability compared to the standard MAP method.
arXiv Detail & Related papers (2023-01-26T18:14:12Z)
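As a rough illustration of the Karhunen-Loève-expansion-plus-MAP idea behind CKLEMAP (the paper's conditional KL expansion and PDE constraints are omitted): represent the unknown field in its leading KL modes and compute the MAP mode weights from sparse noisy observations, which for this linear toy reduces to a closed-form regularized least-squares solve. All quantities are invented for the sketch.

```python
# KL expansion + MAP on a 1-D Gaussian-process field with sparse observations.
import numpy as np

rng = np.random.default_rng(4)
n, n_obs, n_modes = 200, 15, 20
xs = np.linspace(0, 1, n)

# Squared-exponential prior covariance and its KL (eigen-)decomposition.
K = np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / 0.1**2)
evals, evecs = np.linalg.eigh(K)
evals, evecs = evals[::-1][:n_modes], evecs[:, ::-1][:, :n_modes]
Phi = evecs * np.sqrt(np.clip(evals, 0, None))   # field = Phi @ w, w ~ N(0, I)

# Ground-truth field drawn from the prior, observed sparsely with noise.
w_true = rng.standard_normal(n_modes)
f_true = Phi @ w_true
obs_idx = rng.choice(n, size=n_obs, replace=False)
sigma = 0.05
y = f_true[obs_idx] + sigma * rng.standard_normal(n_obs)

# MAP over weights: minimize ||y - Phi_obs w||^2 / sigma^2 + ||w||^2.
Phi_obs = Phi[obs_idx]
w_map = np.linalg.solve(Phi_obs.T @ Phi_obs / sigma**2 + np.eye(n_modes),
                        Phi_obs.T @ y / sigma**2)
f_map = Phi @ w_map
print("relative L2 error:", np.linalg.norm(f_map - f_true) / np.linalg.norm(f_true))
```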
- Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via an approximation of posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
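The posterior-sampling approximation can be sketched in a toy where the prior is standard normal, so the score and the Tweedie denoiser are analytic; in real applications a trained score network supplies both. The guidance step follows the likelihood-gradient-through-the-denoiser idea for a noisy linear measurement; the normalized step-size rule and all settings here are illustrative, not the paper's exact recipe.

```python
# Toy DPS-style loop: DDPM (variance-preserving) ancestral updates with an
# analytic score (prior = N(0, I)) plus likelihood guidance through the
# Tweedie estimate x0hat, for a linear measurement y = A x + noise.
import numpy as np

rng = np.random.default_rng(3)
d, m, sigma_y = 32, 8, 0.1
A = rng.standard_normal((m, d)) / np.sqrt(d)
x_truth = rng.standard_normal(d)                  # drawn from the prior
y = A @ x_truth + sigma_y * rng.standard_normal(m)

n_steps = 200
betas = np.linspace(1e-4, 0.02, n_steps)
alphas = 1.0 - betas
abar = np.cumprod(alphas)
zeta = 1.0                                        # guidance strength

x = rng.standard_normal(d)                        # x_T ~ N(0, I)
for t in reversed(range(n_steps)):
    score = -x                                    # exact for a N(0, I) prior
    x0hat = (x + (1 - abar[t]) * score) / np.sqrt(abar[t])   # Tweedie denoiser
    # Unconditional ancestral (DDPM) step.
    x = (x + betas[t] * score) / np.sqrt(alphas[t])
    if t > 0:
        x += np.sqrt(betas[t]) * rng.standard_normal(d)
    # Guidance: descend the measurement residual through x0hat
    # (d x0hat / d x_t = sqrt(abar_t) here), with a normalized step size.
    resid = y - A @ x0hat
    x += zeta * np.sqrt(abar[t]) * (A.T @ resid) / (np.linalg.norm(resid) + 1e-8)

print("residual |y - A x| :", np.linalg.norm(y - A @ x))
print("noise norm (approx):", sigma_y * np.sqrt(m))
```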
- Sampling Approximately Low-Rank Ising Models: MCMC meets Variational Methods [35.24886589614034]
We consider Ising models on the hypercube with a general interaction matrix $J$.
Our general result implies the first polynomial-time sampling algorithms for low-rank Ising models.
arXiv Detail & Related papers (2022-02-17T21:43:50Z)
- Towards Sample-Optimal Compressive Phase Retrieval with Sparse and Generative Priors [59.33977545294148]
We show that $O(k \log L)$ samples suffice to guarantee that the signal is close to any vector that minimizes an amplitude-based empirical loss function.
We adapt this result to sparse phase retrieval, and show that $O(s \log n)$ samples are sufficient for a similar guarantee when the underlying signal is $s$-sparse and $n$-dimensional.
arXiv Detail & Related papers (2021-06-29T12:49:54Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.