Transport Reversible Jump Proposals
- URL: http://arxiv.org/abs/2210.12572v1
- Date: Sat, 22 Oct 2022 23:48:04 GMT
- Title: Transport Reversible Jump Proposals
- Authors: Laurence Davies, Robert Salomone, Matthew Sutton, Christopher Drovandi
- Abstract summary: We demonstrate an approach to enhance the efficiency of RJMCMC sampling by performing transdimensional jumps involving reference distributions.
It is shown that, in the setting where exact transports are used, our RJMCMC proposals have the desirable property that the acceptance probability depends only on the model probabilities.
- Score: 0.8399688944263843
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reversible jump Markov chain Monte Carlo (RJMCMC) proposals that achieve
reasonable acceptance rates and mixing are notoriously difficult to design in
most applications. Inspired by recent advances in deep neural network-based
normalizing flows and density estimation, we demonstrate an approach to enhance
the efficiency of RJMCMC sampling by performing transdimensional jumps
involving reference distributions. In contrast to other RJMCMC proposals, the
proposed method is the first to apply a non-linear transport-based approach to
construct efficient proposals between models with complicated dependency
structures. It is shown that, in the setting where exact transports are used,
our RJMCMC proposals have the desirable property that the acceptance
probability depends only on the model probabilities. Numerical experiments
demonstrate the efficacy of the approach.
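As a rough illustration of the core idea (not the paper's implementation), the sketch below uses exact affine transports between two toy Gaussian models and a shared standard-normal reference, so the acceptance probability reduces to the ratio of the model probabilities. All model settings are hypothetical, the transports are exact only because the toy posteriors are Gaussian (the paper learns them with normalizing flows), and within-model moves are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy models whose within-model posteriors are Gaussian, so affine maps
# give *exact* transports to a standard-normal reference.  All settings here
# (dimensions, moments, model probabilities) are hypothetical.
models = {
    1: {"dim": 1, "mu": np.array([2.0]), "sd": np.array([0.5]), "prob": 0.3},
    2: {"dim": 2, "mu": np.array([-1.0, 1.0]), "sd": np.array([1.0, 0.2]), "prob": 0.7},
}

def to_reference(k, theta):    # T_k: model-k space -> N(0, I)
    return (theta - models[k]["mu"]) / models[k]["sd"]

def from_reference(k, u):      # T_k^{-1}: N(0, I) -> model-k space
    return models[k]["mu"] + models[k]["sd"] * u

def transdimensional_step(k, theta):
    k_new = 2 if k == 1 else 1
    u = to_reference(k, theta)
    d_new = models[k_new]["dim"]
    if d_new > u.size:         # jumping up: pad with fresh reference draws
        u_new = np.concatenate([u, rng.standard_normal(d_new - u.size)])
    else:                      # jumping down: drop the extra reference coordinates
        u_new = u[:d_new]
    theta_new = from_reference(k_new, u_new)
    # With exact transports, the within-model densities and Jacobians cancel,
    # so the acceptance probability depends only on the model probabilities.
    if rng.random() < min(1.0, models[k_new]["prob"] / models[k]["prob"]):
        return k_new, theta_new
    return k, theta

k, theta = 1, np.array([2.0])
visits = {1: 0, 2: 0}
for _ in range(20000):         # within-model moves omitted for brevity
    k, theta = transdimensional_step(k, theta)
    visits[k] += 1
print(visits)                  # visit frequencies approach the 0.3 / 0.7 model probabilities
```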
Related papers
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density by transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
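A minimal change-of-variables sketch of the "transport a tractable density" viewpoint, using a hypothetical affine map rather than the paper's PINN-learned dynamics:

```python
import numpy as np

# If x = T(z) with z ~ rho0, the transported density is
# q(x) = rho0(T^{-1}(x)) * |det J_{T^{-1}}(x)|.
A = np.array([[2.0, 0.0], [0.5, 1.0]])    # assumed invertible linear map
b = np.array([1.0, -1.0])
A_inv = np.linalg.inv(A)

def log_rho0(z):                          # standard-normal base density
    return -0.5 * (z @ z) - z.size / 2 * np.log(2 * np.pi)

def log_q(x):                             # density of the pushforward T#rho0
    z = A_inv @ (x - b)
    return log_rho0(z) + np.log(abs(np.linalg.det(A_inv)))

rng = np.random.default_rng(1)
x = A @ rng.standard_normal(2) + b        # sample by pushing z through T
print(log_q(x))
```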
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Repelling-Attracting Hamiltonian Monte Carlo [0.8158530638728501]
We propose a variant of Hamiltonian Monte Carlo called Repelling-Attracting Hamiltonian Monte Carlo (RAHMC).
RAHMC involves two stages: a mode-repelling stage that encourages the sampler to move away from regions of high probability density, and a mode-attracting stage that helps the sampler find and settle near alternative modes.
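A schematic sketch of the two-stage idea on a toy bimodal target, using friction-damped (conformal) leapfrog integration. The friction strength, step size, and target are assumptions, and the volume-preservation argument for using the plain MH ratio follows the paper's description rather than a derivation here:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):             # toy bimodal target (an assumption for illustration)
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def grad_log_target(x):
    a, b = -0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2
    wa = np.exp(a - np.logaddexp(a, b))   # softmax weight of the right mode
    return wa * (3.0 - x) + (1.0 - wa) * (-3.0 - x)

def conformal_leapfrog(x, p, gamma, eps, n):
    # Leapfrog for the damped dynamics dp = grad log pi dt - gamma * p dt.
    for _ in range(n):
        p *= np.exp(-gamma * eps / 2)
        p += eps / 2 * grad_log_target(x)
        x += eps * p
        p += eps / 2 * grad_log_target(x)
        p *= np.exp(-gamma * eps / 2)
    return x, p

def rahmc_step(x, eps=0.05, n=30, gamma=0.5):
    p0 = rng.standard_normal()
    # Stage 1: negative friction injects energy (mode-repelling);
    # Stage 2: positive friction dissipates it (mode-attracting).
    x1, p1 = conformal_leapfrog(x, p0, -gamma, eps, n)
    x2, p2 = conformal_leapfrog(x1, p1, +gamma, eps, n)
    # Equal-length stages with opposite friction give a volume-preserving
    # composition (per the paper), so the plain MH ratio applies.
    log_alpha = (log_target(x2) - 0.5 * p2**2) - (log_target(x) - 0.5 * p0**2)
    return x2 if np.log(rng.random()) < log_alpha else x

x = 3.0
samples = [x := rahmc_step(x) for _ in range(5000)]
print(np.mean(np.array(samples) < 0))   # fraction of time spent in the left mode
```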
arXiv Detail & Related papers (2024-03-07T15:54:55Z)
- Improving Transferability of Adversarial Examples via Bayesian Attacks [84.90830931076901]
We introduce a novel extension by incorporating the Bayesian formulation into the model input as well, enabling the joint diversification of both the model input and model parameters.
Our method achieves a new state-of-the-art on transfer-based attacks, improving the average success rate on ImageNet and CIFAR-10 by 19.14% and 2.08%, respectively.
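A heavily simplified sketch of the "joint diversification" idea on a toy logistic model: the attack gradient is averaged over Gaussian perturbations of both the weights and the input before a signed (FGSM-like) step. The model, noise scales, and epsilon are all hypothetical, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy logistic "victim" model (a hypothetical stand-in for a deep network).
w, b = np.array([1.5, -2.0]), 0.3
x, y = np.array([0.5, 0.2]), 1.0           # input to attack, true label

def loss_grad_x(w_s, b_s, x_s):
    # d/dx of the logistic loss, with p = sigmoid(w.x + b)
    return (sigmoid(w_s @ x_s + b_s) - y) * w_s

# Average the loss gradient over sampled weights AND sampled inputs,
# then take a signed step (an FGSM-like update, not the paper's exact method).
g = np.zeros_like(x)
for _ in range(100):
    w_s = w + 0.1 * rng.standard_normal(w.shape)   # parameter diversification
    x_s = x + 0.05 * rng.standard_normal(x.shape)  # input diversification
    g += loss_grad_x(w_s, b, x_s)
x_adv = x + 0.1 * np.sign(g / 100)                 # epsilon = 0.1 (assumed)
print(sigmoid(w @ x + b), sigmoid(w @ x_adv + b))  # model confidence before/after
```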
arXiv Detail & Related papers (2023-07-21T03:43:07Z)
- Dense Hybrid Proposal Modulation for Lane Detection [72.49084826234363]
We present a dense hybrid proposal modulation (DHPM) method for lane detection.
We densely modulate all proposals to generate topologically and spatially high-quality lane predictions.
Our DHPM achieves very competitive performance on four popular datasets.
arXiv Detail & Related papers (2023-04-28T14:31:11Z)
- On Sampling with Approximate Transport Maps [22.03230737620495]
Transport maps can ease the sampling of distributions with non-trivial geometries by transforming them into distributions that are easier to handle.
The potential of this approach has risen with the development of Normalizing Flows (NF) which are maps parameterized with deep neural networks trained to push a reference distribution towards a target.
Recently proposed NF-enhanced samplers blend (Markov chain) Monte Carlo methods with either (i) proposal draws from the flow or (ii) a flow-based reparametrization.
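A minimal sketch of option (i), independent Metropolis-Hastings with a flow proposal; an affine map with tractable density stands in for a trained normalizing flow, and the target is an assumption:

```python
import numpy as np

rng = np.random.default_rng(4)

def log_target(x):             # unnormalized target, N(1, 0.7^2) (an assumption)
    return -0.5 * ((x - 1.0) / 0.7) ** 2

# Stand-in "flow": an affine map z -> mu + s*z with tractable density;
# a trained normalizing flow would replace these two functions.
mu, s = 0.8, 1.0

def flow_sample():
    return mu + s * rng.standard_normal()

def flow_log_density(x):
    return -0.5 * ((x - mu) / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)

def imh_step(x):
    # Option (i): independent Metropolis-Hastings with the flow as proposal.
    y = flow_sample()
    log_alpha = (log_target(y) - flow_log_density(y)) \
              - (log_target(x) - flow_log_density(x))
    return y if np.log(rng.random()) < log_alpha else x

x = 0.0
xs = np.array([x := imh_step(x) for _ in range(20000)])
print(xs.mean(), xs.std())     # approach 1.0 and 0.7 as the chain mixes
```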
arXiv Detail & Related papers (2023-02-09T16:52:52Z)
- Online Probabilistic Model Identification using Adaptive Recursive MCMC [8.465242072268019]
We propose the Adaptive Recursive Markov Chain Monte Carlo (ARMCMC) method.
It computes the entire probability density function of the model parameters while avoiding the shortcomings of conventional online techniques.
We demonstrate our approach using parameter estimation in a soft bending actuator and the Hunt-Crossley dynamic model.
arXiv Detail & Related papers (2022-10-23T02:06:48Z)
- LSB: Local Self-Balancing MCMC in Discrete Spaces [2.385916960125935]
This work considers using machine learning to adapt the proposal distribution to the target in order to improve sampling efficiency in purely discrete domains.
We call the resulting sampler the Locally Self-Balancing Sampler (LSB).
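A minimal sketch of the locally balanced proposals that LSB builds on, with the balancing function fixed to the classical square-root choice (LSB instead learns it); the binary toy target is an assumption:

```python
import numpy as np

rng = np.random.default_rng(5)
d = 8
theta = rng.standard_normal(d)

def log_pi(x):                 # toy unnormalized log target on {0,1}^d
    return float(x @ theta)

def flip(x, i):
    y = x.copy(); y[i] ^= 1
    return y

def proposal_probs(x, g=np.sqrt):
    # Locally balanced proposal over single-bit flips:
    # Q(flip i | x) proportional to g(pi(x with bit i flipped) / pi(x)).
    # LSB *learns* the balancing function g; sqrt is one classical choice.
    w = np.array([g(np.exp(log_pi(flip(x, i)) - log_pi(x))) for i in range(d)])
    return w / w.sum()

def step(x):
    qx = proposal_probs(x)
    i = rng.choice(d, p=qx)
    y = flip(x, i)
    qy = proposal_probs(y)     # flipping bit i of y maps back to x
    log_alpha = log_pi(y) - log_pi(x) + np.log(qy[i]) - np.log(qx[i])
    return y if np.log(rng.random()) < log_alpha else x

x = np.zeros(d, dtype=int)
for _ in range(5000):
    x = step(x)
print(x, [int(t > 0) for t in theta])   # bits should mostly match sign(theta)
```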
arXiv Detail & Related papers (2021-09-08T18:31:26Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
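A minimal sketch of the de-biasing step: self-normalized importance sampling with a (deliberately misfit) variational Gaussian as proposal. The paper's actual contribution, refining the proposal by minimizing the forward KL so it has the mass-covering behavior IS needs, is not shown here; all numbers below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

def log_p(x):                  # unnormalized target, N(2, 0.5^2) (an assumption)
    return -0.5 * ((x - 2.0) / 0.5) ** 2

# Suppose VI returned this slightly misfit Gaussian approximation q.
mu_q, sd_q = 1.7, 0.8          # hypothetical variational solution

xs = mu_q + sd_q * rng.standard_normal(50000)   # draws from q
log_q = -0.5 * ((xs - mu_q) / sd_q) ** 2 - np.log(sd_q) - 0.5 * np.log(2 * np.pi)
log_w = log_p(xs) - log_q                       # importance log-weights
w = np.exp(log_w - log_w.max())
w /= w.sum()                                    # self-normalized weights

# IS de-biases the VI estimate of E[x]: plain q-mean vs. weighted mean.
print(xs.mean(), (w * xs).sum())                # ~1.7 (biased) vs. ~2.0 (de-biased)
```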
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Comparing Probability Distributions with Conditional Transport [63.11403041984197]
We propose conditional transport (CT) as a new divergence and approximate it with the amortized CT (ACT) cost.
ACT amortizes the computation of its conditional transport plans and comes with unbiased sample gradients that are straightforward to compute.
Across a wide variety of generative modeling benchmarks, substituting ACT for the default statistical distance of an existing generative adversarial network consistently improves performance.
arXiv Detail & Related papers (2020-12-28T05:14:22Z)
- A Neural Network MCMC sampler that maximizes Proposal Entropy [3.4698840925433765]
Augmenting samplers with neural networks can potentially improve their efficiency.
Our network architecture utilizes the gradient of the target distribution for generating proposals.
The adaptive sampler achieves unbiased sampling with significantly higher proposal entropy than a Langevin dynamics sampler.
arXiv Detail & Related papers (2020-10-07T18:01:38Z)
- Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees [49.91477656517431]
Quantization-based solvers have been widely adopted in Federated Learning (FL).
However, no existing method enjoys all of the desired properties.
We propose an intuitively simple yet theoretically sound method based on SIGNSGD to bridge the gap.
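A minimal sketch of the stochastic-sign-plus-majority-vote pattern on a toy federated least-squares problem; the 1-bit quantizer below is a standard unbiased construction and may differ from the paper's exact scheme, and all problem settings are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

def stochastic_sign(g, B=1.0):
    # 1-bit stochastic quantizer: +1 with prob (B + g) / (2B), else -1,
    # so E[stochastic_sign(g)] = g / B (unbiased up to scaling).
    g = np.clip(g, -B, B)
    return np.where(rng.random(g.shape) < (B + g) / (2 * B), 1.0, -1.0)

def server_step(x, worker_grads, lr=0.01):
    # Each worker uploads only stochastic signs; the server takes a
    # majority vote and applies a signSGD-style update.
    votes = sum(stochastic_sign(g) for g in worker_grads)
    return x - lr * np.sign(votes)

# Toy federated least-squares: worker m holds target t_m.
targets = [np.array([1.0, -2.0]), np.array([3.0, 0.0]), np.array([2.0, -1.0])]
x = np.zeros(2)
for _ in range(2000):
    grads = [x - t for t in targets]       # gradient of 0.5 * ||x - t||^2
    x = server_step(x, grads)
print(x, np.mean(targets, axis=0))         # x should approach the mean target
```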
arXiv Detail & Related papers (2020-02-25T15:12:15Z)