The Boomerang Sampler
- URL: http://arxiv.org/abs/2006.13777v2
- Date: Tue, 11 Aug 2020 15:54:39 GMT
- Title: The Boomerang Sampler
- Authors: Joris Bierkens, Sebastiano Grazzi, Kengo Kamatani, Gareth Roberts
- Abstract summary: This paper introduces the Boomerang Sampler as a novel class of continuous-time non-reversible Markov chain Monte Carlo algorithms.
We show that the method is easy to implement and demonstrate empirically that it can outperform existing benchmark piecewise deterministic Markov processes.
- Score: 4.588028371034406
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces the Boomerang Sampler as a novel class of
continuous-time non-reversible Markov chain Monte Carlo algorithms. The
methodology begins by representing the target density as a density, $e^{-U}$,
with respect to a prescribed (usually) Gaussian measure and constructs a
continuous trajectory consisting of a piecewise elliptical path. The method
moves from one elliptical orbit to another according to a rate function which
can be written in terms of $U$. We show that the method is easy to
implement and demonstrate empirically that it can outperform existing
benchmark piecewise deterministic Markov processes such as the bouncy particle
sampler and the Zig-Zag sampler. In the Bayesian statistics context, these competitor
algorithms are of substantial interest in the large-data setting because
they can adopt data subsampling techniques which are exact (i.e., they induce
no error in the stationary distribution). We demonstrate theoretically and
empirically that we can also construct a control-variate subsampling boomerang
sampler which is also exact, and which possesses remarkable scaling properties
in the large data limit. We furthermore illustrate a factorised version on the
simulation of diffusion bridges.
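The dynamics described in the abstract (piecewise elliptical paths, with orbit switches driven by a rate written in terms of $U$) can be sketched in a few lines. The following is a minimal, hedged illustration, not the authors' implementation: it assumes a standard Gaussian reference measure, approximates the switching events on a fine time grid rather than by exact Poisson thinning, and all function names and the toy target are our own.

```python
import numpy as np

def boomerang_sampler(grad_U, d, T=1000.0, dt=0.01, refresh_rate=0.1, seed=0):
    """Sketch of a Boomerang-style sampler targeting e^{-U} relative to a
    standard Gaussian reference N(0, I_d).  Between events, (x, v) follows
    the elliptical flow x(t) = x0*cos(t) + v0*sin(t),
    v(t) = -x0*sin(t) + v0*cos(t).  Events are approximated on a grid of
    width dt instead of being simulated exactly by Poisson thinning."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    v = rng.standard_normal(d)
    samples = []
    for _ in range(int(T / dt)):
        # deterministic elliptical rotation over one small time step
        x, v = x * np.cos(dt) + v * np.sin(dt), -x * np.sin(dt) + v * np.cos(dt)
        g = grad_U(x)
        rate = max(0.0, float(v @ g))        # switching rate <v, grad U(x)>
        if rng.random() < rate * dt:         # approximate switching event
            v = v - 2.0 * (v @ g) / (g @ g) * g  # reflect v in grad U(x)
        if rng.random() < refresh_rate * dt:     # velocity refreshment
            v = rng.standard_normal(d)
        samples.append(x.copy())
    return np.array(samples)

# Toy target: N(0, sigma^2) written relative to N(0, 1), so that
# U(x) = (x^2 / 2) * (1/sigma^2 - 1) and grad U(x) = x * (1/sigma^2 - 1).
sigma2 = 0.5
traj = boomerang_sampler(lambda x: x * (1.0 / sigma2 - 1.0), d=1)
print(traj.mean(), traj.var())
```

For this toy target the empirical variance of the trajectory should settle near sigma^2 = 0.5, up to discretization bias from the gridded event times; an exact implementation would instead bound the rate and thin a dominating Poisson process.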
Related papers
- A Practical Diffusion Path for Sampling [8.174664278172367]
Diffusion models are used in generative modeling to estimate score vectors guiding a Langevin process.
Previous approaches rely on Monte Carlo estimators that are either computationally heavy to implement or sample-inefficient.
We propose a computationally attractive alternative, relying on the so-called dilation path, that yields score vectors that are available in closed-form.
arXiv Detail & Related papers (2024-06-20T07:00:56Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Chain of Log-Concave Markov Chains [2.9465623430708905]
We introduce a theoretical framework for sampling from unnormalized densities.
We prove one can decompose sampling from a density into a sequence of sampling from log-concave conditional densities.
We report a remarkable capacity of our algorithm to "tunnel" between modes of a distribution.
arXiv Detail & Related papers (2023-05-31T01:00:35Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Machine Learning Trivializing Maps: A First Step Towards Understanding How Flow-Based Samplers Scale Up [0.6445605125467573]
We show that approximations of trivializing maps can be machine-learned by a class of invertible, differentiable models.
We conduct an exploratory scaling study using two-dimensional $\phi^4$ theory with up to $20^2$ lattice sites.
arXiv Detail & Related papers (2021-12-31T16:17:19Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Generative Learning With Euler Particle Transport [14.557451744544592]
We propose an Euler particle transport (EPT) approach for generative learning.
The proposed approach is motivated by the problem of finding an optimal transport map from a reference distribution to a target distribution.
We show that the proposed density-ratio (difference) estimators do not suffer from the "curse of dimensionality" if data is supported on a lower-dimensional manifold.
arXiv Detail & Related papers (2020-12-11T03:10:53Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- Nonparametric Bayesian volatility learning under microstructure noise [2.812395851874055]
We study the problem of learning the volatility under market microstructure noise.
Specifically, we consider noisy, discrete-time observations from a stochastic differential equation.
We develop a novel computational method to learn the diffusion coefficient of the equation.
arXiv Detail & Related papers (2018-05-15T07:32:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.