Auxiliary MCMC and particle Gibbs samplers for parallelisable inference
in latent dynamical systems
- URL: http://arxiv.org/abs/2303.00301v1
- Date: Wed, 1 Mar 2023 07:53:58 GMT
- Authors: Adrien Corenflos and Simo Särkkä
- Abstract summary: We introduce two new classes of exact Markov chain Monte Carlo (MCMC) samplers for inference in latent dynamical models.
The first one, which we coin auxiliary Kalman samplers, relies on finding a linear Gaussian state-space model approximation around the running trajectory corresponding to the state of the Markov chain.
The second, which we name auxiliary particle Gibbs samplers, corresponds to deriving good local proposals in an auxiliary Feynman--Kac model for use in particle Gibbs.
- Score: 3.42658286826597
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce two new classes of exact Markov chain Monte Carlo (MCMC)
samplers for inference in latent dynamical models. The first one, which we coin
auxiliary Kalman samplers, relies on finding a linear Gaussian state-space
model approximation around the running trajectory corresponding to the state of
the Markov chain. The second, which we name auxiliary particle Gibbs samplers,
corresponds to deriving good local proposals in an auxiliary Feynman--Kac model
for use in particle Gibbs. Both samplers are controlled by augmenting the
target distribution with auxiliary observations, resulting in an efficient
Gibbs sampling routine. We discuss the relative statistical and computational
performance of the samplers introduced, and show how to parallelise the
auxiliary samplers along the time dimension. We illustrate the respective
benefits and drawbacks of the resulting algorithms on classical examples from
the particle filtering literature.
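As a minimal illustration of the auxiliary-observation idea described in the abstract (not the paper's actual algorithm, which operates on full state-space trajectories via Kalman smoothing or particle Gibbs), the Gibbs routine can be sketched on a one-dimensional standard normal target. Here the hypothetical parameter `delta` plays the role of the auxiliary observation noise: augmenting the target with u | x ~ N(x, delta) makes both conditionals Gaussian, and alternating between them leaves the N(0, 1) marginal invariant.

```python
import math
import random


def auxiliary_gibbs(n_iters=50000, delta=0.5, seed=0):
    """Auxiliary-variable Gibbs sampler for a standard normal target N(0, 1).

    Augmentation: u | x ~ N(x, delta). By conjugacy, the conditional
    x | u is Gaussian with mean u / (1 + delta) and variance
    delta / (1 + delta), so the x-marginal of the chain is N(0, 1).
    """
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_iters):
        # Step 1: draw the auxiliary "observation" around the current state.
        u = rng.gauss(x, math.sqrt(delta))
        # Step 2: draw the state from its exact Gaussian conditional given u.
        post_var = delta / (1.0 + delta)
        post_mean = u / (1.0 + delta)
        x = rng.gauss(post_mean, math.sqrt(post_var))
        samples.append(x)
    return samples
```

Smaller `delta` means the auxiliary observation stays closer to the current state, giving higher acceptance-free local moves but slower mixing; this is the statistical/computational trade-off the paper discusses for its trajectory-level samplers.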
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Scalability of Metropolis-within-Gibbs schemes for high-dimensional Bayesian models [0.0]
We study general coordinate-wise MCMC schemes (such as Metropolis-within-Gibbs samplers)
We relate their convergence properties to the ones of the corresponding Gibbs sampler through the notion of conditional conductance.
This allows us to study the performances of popular Metropolis-within-Gibbs schemes for non-conjugate hierarchical models.
arXiv Detail & Related papers (2024-03-14T14:04:44Z) - Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z) - Entropy-based Training Methods for Scalable Neural Implicit Sampler [15.978655106034113]
Efficiently sampling from un-normalized target distributions is a fundamental problem in scientific computing and machine learning.
In this paper, we propose an efficient and scalable neural implicit sampler that overcomes these limitations.
Our sampler can generate large batches of samples with low computational costs by leveraging a neural transformation that directly maps easily sampled latent vectors to target samples.
arXiv Detail & Related papers (2023-06-08T05:56:05Z) - Unrolling Particles: Unsupervised Learning of Sampling Distributions [102.72972137287728]
Particle filtering is used to compute good nonlinear estimates of complex systems.
We show in simulations that the resulting particle filter yields good estimates in a wide range of scenarios.
arXiv Detail & Related papers (2021-10-06T16:58:34Z) - Direct sampling of projected entangled-pair states [0.0]
Variational Monte Carlo studies employing projected entangled-pair states (PEPS) have recently shown that they can provide answers on long-standing questions.
We propose a sampling algorithm that generates independent samples from a PEPS, bypassing all problems related to finite autocorrelation times.
arXiv Detail & Related papers (2021-09-15T15:09:20Z) - A fast asynchronous MCMC sampler for sparse Bayesian inference [10.535140830570256]
We propose a very fast approximate Markov Chain Monte Carlo (MCMC) sampling framework that is applicable to a large class of sparse Bayesian inference problems.
We show that in high-dimensional linear regression problems, the Markov chain generated by the proposed algorithm admits an invariant distribution that recovers correctly the main signal.
arXiv Detail & Related papers (2021-08-14T02:20:49Z) - Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
arXiv Detail & Related papers (2021-06-18T15:36:09Z) - Oops I Took A Gradient: Scalable Sampling for Discrete Distributions [53.3142984019796]
We show that this approach outperforms generic samplers in a number of difficult settings.
We also demonstrate the use of our improved sampler for training deep energy-based models on high dimensional discrete data.
arXiv Detail & Related papers (2021-02-08T20:08:50Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.