Sampling from multi-modal distributions on Riemannian manifolds with training-free stochastic interpolants
- URL: http://arxiv.org/abs/2602.00641v1
- Date: Sat, 31 Jan 2026 10:17:44 GMT
- Title: Sampling from multi-modal distributions on Riemannian manifolds with training-free stochastic interpolants
- Authors: Alain Durmus, Maxence Noble, Thibaut Pellerin
- Abstract summary: We introduce a sampling algorithm based on the simulation of a non-equilibrium deterministic dynamics that transports an easy-to-sample noise distribution toward the target. In contrast to related generative modeling approaches that rely on machine learning, our method is entirely training-free.
- Score: 17.07401986649233
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose a general methodology for sampling from un-normalized densities defined on Riemannian manifolds, with a particular focus on multi-modal targets that remain challenging for existing sampling methods. Inspired by the framework of diffusion models developed for generative modeling, we introduce a sampling algorithm based on the simulation of a non-equilibrium deterministic dynamics that transports an easy-to-sample noise distribution toward the target. At the marginal level, the induced density path follows a prescribed stochastic interpolant between the noise and target distributions, specifically constructed to respect the underlying Riemannian geometry. In contrast to related generative modeling approaches that rely on machine learning, our method is entirely training-free. It instead builds on iterative posterior sampling procedures using only standard Monte Carlo techniques, thereby extending recent diffusion-based sampling methodologies beyond the Euclidean setting. We complement our approach with a rigorous theoretical analysis and demonstrate its effectiveness on a range of multi-modal sampling problems, including high-dimensional and heavy-tailed examples.
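The core loop the abstract describes — transporting noise to the target along a prescribed interpolant path, with the drift estimated on the fly by Monte Carlo posterior sampling instead of a trained network — can be illustrated with a minimal Euclidean 1D sketch. This is not the paper's Riemannian algorithm: the linear schedule, the bimodal toy target, and the Gaussian importance-sampling proposal are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(y):
    # unnormalized bimodal target: equal-weight Gaussians at +/-2 (toy choice)
    return np.logaddexp(-0.5 * ((y + 2.0) / 0.3) ** 2,
                        -0.5 * ((y - 2.0) / 0.3) ** 2)

# fixed proposal cloud for the self-normalized importance-sampling estimates
PROP_STD = 3.0
y = rng.normal(0.0, PROP_STD, 4000)
log_q = -0.5 * (y / PROP_STD) ** 2  # up to a constant; cancels on normalization

def posterior_mean(x, t):
    """Estimate E[x1 | x_t = x] for the interpolant x_t = (1-t) z + t x1."""
    s = max(1.0 - t, 1e-3)  # conditional std of x_t given x1
    logw = log_target(y) - log_q \
        - 0.5 * ((x[:, None] - t * y[None, :]) / s) ** 2
    logw -= logw.max(axis=1, keepdims=True)
    w = np.exp(logw)
    return (w * y[None, :]).sum(axis=1) / w.sum(axis=1)

# Euler integration of the probability-flow ODE dx/dt = b(t, x), where
# b = alpha'(t) E[z | x_t] + beta'(t) E[x1 | x_t] with alpha = 1-t, beta = t
n_particles, n_steps = 400, 120
x = rng.normal(0.0, 1.0, n_particles)  # start from the noise distribution
ts = np.linspace(0.0, 0.99, n_steps + 1)
for t0, t1 in zip(ts[:-1], ts[1:]):
    m = posterior_mean(x, t0)
    b = -(x - t0 * m) / max(1.0 - t0, 1e-3) + m
    x = x + (t1 - t0) * b

print(np.abs(x).mean(), (x > 0).mean())
```

Because the drift is estimated from weighted samples of the unnormalized target at every step, no training phase is needed; both modes should be populated at the end of the integration.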
Related papers
- TFTF: Training-Free Targeted Flow for Conditional Sampling [1.4151684142137693]
We propose a training-free conditional sampling method for flow matching models based on importance sampling.
Because a naïve application of importance sampling suffers from weight degeneracy in high-dimensional settings, we modify and incorporate a resampling technique from sequential Monte Carlo.
Our framework requires no additional training, while providing theoretical guarantees of accuracy.
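The weight-degeneracy remedy mentioned here — monitoring the effective sample size (ESS) and resampling when it collapses — can be sketched with a standard systematic resampler. The toy target/proposal pair and the 50% ESS threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def systematic_resample(particles, logw):
    """Duplicate particles in proportion to their normalized weights."""
    w = np.exp(logw - logw.max())
    w /= w.sum()
    n = len(particles)
    u = (rng.random() + np.arange(n)) / n  # one stratified uniform per slot
    idx = np.searchsorted(np.cumsum(w), u)
    return particles[idx]

# toy: reweight N(0,1) proposal draws toward a shifted target N(1.5, 0.7)
x = rng.normal(0.0, 1.0, 5000)
logw = -0.5 * ((x - 1.5) / 0.7) ** 2 + 0.5 * x ** 2  # log(target/proposal)

# ESS = (sum w)^2 / sum w^2, computed stably in log space
ess = np.exp(2 * np.logaddexp.reduce(logw) - np.logaddexp.reduce(2 * logw))
if ess < 0.5 * len(x):  # resample only once weights have degenerated
    x = systematic_resample(x, logw)
print(x.mean(), x.std(), ess)
```

After resampling, the particle cloud is an (unweighted) approximation of the target, at the cost of some duplicate particles; resampling only below an ESS threshold limits the extra variance this introduces.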
arXiv Detail & Related papers (2026-02-13T13:41:35Z)
- An Elementary Approach to Scheduling in Generative Diffusion Models [55.171367482496755]
An elementary approach to characterizing the impact of noise scheduling and time discretization in generative diffusion models is developed.
Experiments across different datasets and pretrained models demonstrate that the time discretization strategy selected by our approach consistently outperforms baseline and search-based strategies.
arXiv Detail & Related papers (2026-01-20T05:06:26Z)
- Sampling by averaging: A multiscale approach to score estimation [4.003851730099099]
We introduce a novel framework for efficient sampling from complex, unnormalised target distributions by exploiting multiscale dynamics.
Two algorithms are developed: MultALMC and MultCDiff, based on multiscale controlled diffusions for the reverse-time Ornstein-Uhlenbeck process.
The framework is extended to handle heavy-tailed target distributions using Student's t-based noise models and tailored fast-process dynamics.
arXiv Detail & Related papers (2025-08-20T21:09:34Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Stochastic Localization via Iterative Posterior Sampling [2.1383136715042417]
We consider a general localization framework and introduce an explicit class of observation processes, associated with flexible denoising schedules.
We provide a complete methodology, Stochastic Localization via Iterative Posterior Sampling (SLIPS), to obtain approximate samples of this dynamics and, as a byproduct, samples from the target distribution.
We illustrate the benefits and applicability of SLIPS on several benchmarks of multi-modal distributions, including mixtures in increasing dimensions, logistic regression, and a high-dimensional field system from statistical mechanics.
arXiv Detail & Related papers (2024-02-16T15:28:41Z)
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
- Diffusive Gibbs Sampling [40.1197715949575]
We propose Diffusive Gibbs Sampling (DiGS) for effective sampling from distributions characterized by distant and disconnected modes.
DiGS integrates recent developments in diffusion models, leveraging Gaussian convolution to create an auxiliary noisy distribution.
A novel Metropolis-within-Gibbs scheme is proposed to enhance mixing in the denoising sampling step.
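The two-block structure described in this entry — a noising step that convolves the target with a Gaussian, followed by a Metropolis-within-Gibbs denoising step — can be sketched on a 1D bimodal toy target. The noise level, step sizes, and target are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)
SIGMA = 2.0  # std of the Gaussian convolution (assumed, not from the paper)

def log_target(x):
    # unnormalized bimodal target with well-separated modes at +/-4
    return np.logaddexp(-0.5 * ((x + 4.0) / 0.5) ** 2,
                        -0.5 * ((x - 4.0) / 0.5) ** 2)

def denoising_step(x_noisy, x_init, n_mh=20, step=0.8):
    """Metropolis steps targeting the posterior pi(x) * N(x_noisy; x, SIGMA^2)."""
    x = x_init
    logp = log_target(x) - 0.5 * ((x_noisy - x) / SIGMA) ** 2
    for _ in range(n_mh):
        prop = x + step * rng.normal()
        logp_prop = log_target(prop) - 0.5 * ((x_noisy - prop) / SIGMA) ** 2
        if np.log(rng.random()) < logp_prop - logp:
            x, logp = prop, logp_prop
    return x

x, chain = 4.0, []
for _ in range(3000):
    x_noisy = x + SIGMA * rng.normal()   # noising step: Gaussian convolution
    x = denoising_step(x_noisy, x_noisy)  # denoising step, started at x_noisy
    chain.append(x)
chain = np.array(chain)
print((chain > 0).mean(), chain.min(), chain.max())
```

The auxiliary noisy variable bridges the two distant modes: whenever `x_noisy` lands between them, the denoising posterior gives both modes appreciable mass, so the chain can switch sides — something a plain random-walk sampler on this target essentially never does.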
arXiv Detail & Related papers (2024-02-05T13:47:41Z)
- Gradient-Free Score-Based Sampling Methods with Ensembles [0.0]
We introduce ensembles within score-based sampling methods to develop gradient-free approximate sampling techniques.
We demonstrate the efficacy of the ensemble strategies through various examples.
Our findings highlight the potential of ensemble strategies for modeling complex probability distributions.
arXiv Detail & Related papers (2024-01-31T01:51:29Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.