Bouncy particle sampler with infinite exchanging parallel tempering
- URL: http://arxiv.org/abs/2509.02003v1
- Date: Tue, 02 Sep 2025 06:37:57 GMT
- Title: Bouncy particle sampler with infinite exchanging parallel tempering
- Authors: Yohei Saito, Shun Kimura, Koujin Takeda,
- Abstract summary: We employ variational Bayesian inference or sampling methods to approximate posterior distributions. A bouncy particle sampler (BPS) has been proposed, which combines uniform linear motion and reflection to perform sampling. We performed numerical simulations and demonstrated its effectiveness for multimodal distributions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian inference is useful for obtaining a predictive distribution with a small generalization error. However, since posterior distributions can rarely be evaluated analytically, we employ variational Bayesian inference or sampling methods to approximate them. To obtain samples from a posterior distribution, Hamiltonian Monte Carlo (HMC) has been widely used for the continuous-variable part and Markov chain Monte Carlo (MCMC) for the discrete-variable part. Another sampling method, the bouncy particle sampler (BPS), has been proposed, which combines uniform linear motion and stochastic reflection to perform sampling. BPS has been reported to have the advantage that its simulation parameters are easier to set than those of HMC. To accelerate convergence to a posterior distribution, we introduce parallel tempering (PT) into BPS and propose an algorithm for the limit in which the inverse-temperature exchange rate is taken to infinity. We performed numerical simulations and demonstrated the method's effectiveness for multimodal distributions.
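To make the two ingredients of the abstract concrete, the following is a minimal, discretized sketch of a parallel-tempered BPS for a standard Gaussian target. It is not the paper's algorithm (which uses exact event times and the infinite-exchange-rate limit): bounce events are approximated with a small time step, only two replicas are used, and the swap move is an ordinary Metropolis replica exchange. All function and parameter names here are illustrative.

```python
import numpy as np

def grad_U(x):
    """Gradient of the potential U(x) = ||x||^2 / 2 (standard Gaussian target)."""
    return x

def bps_step(x, v, beta, dt, refresh_rate, rng):
    """One discretized BPS step at inverse temperature beta.

    The tempered potential is beta * U(x): the particle moves in a straight
    line, bounces with probability max(0, beta * v . grad U(x)) * dt, and its
    velocity is occasionally refreshed to ensure ergodicity.
    """
    x = x + dt * v                                  # uniform linear motion
    g = grad_U(x)
    rate = max(0.0, beta * (v @ g))                 # bounce intensity
    if rng.random() < rate * dt:                    # bounce: reflect v off grad U
        v = v - 2.0 * (v @ g) / (g @ g) * g
    if rng.random() < refresh_rate * dt:            # velocity refreshment
        v = rng.standard_normal(x.size)
        v /= np.linalg.norm(v)
    return x, v

def pt_bps(betas=(1.0, 0.2), n_steps=100_000, dt=0.02,
           refresh_rate=1.0, swap_every=50, dim=2, seed=0):
    """Parallel-tempered BPS: one replica per inverse temperature, with
    Metropolis swaps of replica positions at fixed intervals."""
    rng = np.random.default_rng(seed)
    U = lambda x: 0.5 * (x @ x)
    xs = [np.zeros(dim) for _ in betas]
    vs = []
    for _ in betas:
        v = rng.standard_normal(dim)
        vs.append(v / np.linalg.norm(v))
    samples = np.empty((n_steps, dim))              # cold-chain (beta = 1) samples
    for i in range(n_steps):
        for k, beta in enumerate(betas):
            xs[k], vs[k] = bps_step(xs[k], vs[k], beta, dt, refresh_rate, rng)
        if i % swap_every == 0:                     # attempt a replica swap
            log_a = (betas[0] - betas[1]) * (U(xs[0]) - U(xs[1]))
            if np.log(rng.random()) < log_a:
                xs[0], xs[1] = xs[1], xs[0]
        samples[i] = xs[0]
    return samples
```

The swap acceptance probability min(1, exp((beta_0 - beta_1)(U(x_0) - U(x_1)))) is the standard replica-exchange rule; the hot replica (small beta) explores widely and feeds distant states to the cold one, which is what helps with multimodal targets. Velocities need not be swapped, since the stationary velocity distribution is independent of position.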
Related papers
- An Elementary Approach to Scheduling in Generative Diffusion Models [55.171367482496755]
An elementary approach to characterizing the impact of noise scheduling and time discretization in generative diffusion models is developed. Experiments across different datasets and pretrained models demonstrate that the time discretization strategy selected by our approach consistently outperforms baseline and search-based strategies.
arXiv Detail & Related papers (2026-01-20T05:06:26Z) - Sampling from multimodal distributions with warm starts: Non-asymptotic bounds for the Reweighted Annealed Leap-Point Sampler [10.161956880665734]
We introduce Reweighted ALPS (Re-ALPS), a modified approximation of the Annealed Leap-Point Sampler (ALPS). We define distributions tilted towards a mixture centered at the warm start points, and at the coldest level, use teleportation between warm start points to enable efficient mixing across modes. In contrast to ALPS, our method does not require Hessian information at the modes, but instead estimates component partition functions via Monte Carlo.
arXiv Detail & Related papers (2025-12-19T12:11:16Z) - Restricted Spectral Gap Decomposition for Simulated Tempering Targeting Mixture Distributions [3.7577421880330535]
We consider simulated tempering combined with an arbitrary local Markov chain Monte Carlo sampler. We present a new decomposition theorem that provides a lower bound on the restricted spectral gap of the algorithm for sampling from mixture distributions.
arXiv Detail & Related papers (2025-05-21T03:28:55Z) - Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers. We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions. This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Persistent Sampling: Enhancing the Efficiency of Sequential Monte Carlo [0.0]
Sequential Monte Carlo (SMC) samplers are powerful tools for Bayesian inference but suffer from high computational costs. We introduce persistent sampling (PS), an extension of SMC that retains and reuses particles from all prior iterations.
arXiv Detail & Related papers (2024-07-30T10:34:40Z) - Entropy-MCMC: Sampling from Flat Basins with Ease [10.764160559530849]
We introduce an auxiliary guiding variable, the stationary distribution of which resembles a smoothed posterior free from sharp modes, to lead the MCMC sampler to flat basins.
By integrating this guiding variable with the model parameter, we create a simple joint distribution that enables efficient sampling with minimal computational overhead.
Empirical results demonstrate that our method can successfully sample from flat basins of the posterior, and outperforms all compared baselines on multiple benchmarks.
arXiv Detail & Related papers (2023-10-09T04:40:20Z) - Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Continuously-Tempered PDMP Samplers [2.294014185517203]
We show how tempering ideas can improve the mixing of piecewise deterministic Markov processes (PDMPs).
We introduce an extended distribution defined over the state of the posterior distribution and an inverse temperature.
We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution.
arXiv Detail & Related papers (2022-05-19T13:32:44Z) - Unrolling Particles: Unsupervised Learning of Sampling Distributions [102.72972137287728]
Particle filtering is used to compute good nonlinear estimates of complex systems.
We show in simulations that the resulting particle filter yields good estimates in a wide range of scenarios.
arXiv Detail & Related papers (2021-10-06T16:58:34Z) - Relative Entropy Gradient Sampler for Unnormalized Distributions [14.060615420986796]
We propose the relative entropy gradient sampler (REGS) for sampling from unnormalized distributions.
REGS is a particle method that seeks a sequence of simple nonlinear transforms iteratively pushing the initial samples from a reference distribution into the samples from an unnormalized target distribution.
arXiv Detail & Related papers (2021-10-06T14:10:38Z) - Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.