Preconditioned Regularized Wasserstein Proximal Sampling
- URL: http://arxiv.org/abs/2509.01685v1
- Date: Mon, 01 Sep 2025 18:04:31 GMT
- Title: Preconditioned Regularized Wasserstein Proximal Sampling
- Authors: Hong Ye Tan, Stanley Osher, Wuchen Li
- Abstract summary: We consider sampling from a Gibbs distribution by evolving finitely many particles. For quadratic potentials, we provide a non-asymptotic convergence analysis and explicitly characterize the bias, which depends on the regularization.
- Score: 2.7957842724446174
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider sampling from a Gibbs distribution by evolving finitely many particles. We propose a preconditioned version of a recently proposed noise-free sampling method, governed by approximating the score function with the numerically tractable score of a regularized Wasserstein proximal operator. This is derived by a Cole--Hopf transformation on coupled anisotropic heat equations, yielding a kernel formulation for the preconditioned regularized Wasserstein proximal. The diffusion component of the proposed method can also be interpreted as a modified self-attention block, as in transformer architectures. For quadratic potentials, we provide a discrete-time non-asymptotic convergence analysis and explicitly characterize the bias, which depends on the regularization and is independent of the step-size. Experiments demonstrate acceleration and particle-level stability on examples ranging from log-concave and non-log-concave toy distributions to Bayesian total-variation regularized image deconvolution, and competitive or better performance on non-convex Bayesian neural network training when using variable preconditioning matrices.
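The deterministic particle evolution described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact scheme: it assumes a softmax-weighted kernel of the generic shape w_j ∝ exp(-(V(y_j) + ||x - y_j||² / (2T)) / β), which mirrors the attention-style structure the abstract describes; the parameter names `h`, `T`, and `beta`, the kernel formula, and the function names are illustrative assumptions.

```python
import numpy as np

def kernel_score(x, particles, V, T, beta):
    # Softmax-weighted ("self-attention"-style) estimate of the proximal score.
    # The kernel shape is an illustrative assumption, not the paper's exact
    # formula: w_j ∝ exp(-(V(y_j) + ||x - y_j||^2 / (2T)) / beta).
    logits = np.array([-(V(y) + np.sum((x - y) ** 2) / (2 * T)) / beta
                       for y in particles])
    w = np.exp(logits - logits.max())   # numerically stable softmax
    w /= w.sum()
    barycenter = w @ particles          # attention-style convex combination
    return -(x - barycenter) / T        # score of the implied smoothed density

def noise_free_step(particles, V, gradV, h, T, beta):
    # One deterministic update: each particle follows the drift -grad V minus
    # the kernel-estimated score; no Gaussian noise is injected.
    return np.array([x - h * (gradV(x) + kernel_score(x, particles, V, T, beta))
                     for x in particles])
```

Because the update is noise-free, the stochasticity of Langevin-type samplers is replaced by the repulsion encoded in the score term, which is what keeps the finite particle ensemble spread out rather than collapsing to the mode.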
Related papers
- An Elementary Approach to Scheduling in Generative Diffusion Models [55.171367482496755]
An elementary approach to characterizing the impact of noise scheduling and time discretization in generative diffusion models is developed. Experiments across different datasets and pretrained models demonstrate that the time discretization strategy selected by our approach consistently outperforms baseline and search-based strategies.
arXiv Detail & Related papers (2026-01-20T05:06:26Z) - Bregman geometry-aware split Gibbs sampling for Bayesian Poisson inverse problems [8.115032818930457]
We propose a novel framework for solving inverse problems by a Monte Carlo sampling algorithm. We show that the method achieves competitive performance in terms of reconstruction quality.
arXiv Detail & Related papers (2025-11-15T15:27:31Z) - A Free Probabilistic Framework for Denoising Diffusion Models: Entropy, Transport, and Reverse Processes [22.56299060022639]
This paper builds on Voiculescu's theory of free entropy and free Fisher information. We formulate diffusion and quantify reverse processes governed by operator-valued dynamics. The resulting dynamics admit a gradient-flow structure in the noncommutative Wasserstein space.
arXiv Detail & Related papers (2025-10-26T18:03:54Z) - Convergence of Deterministic and Stochastic Diffusion-Model Samplers: A Simple Analysis in Wasserstein Distance [0.0]
We provide convergence guarantees in Wasserstein distance for diffusion-based generative models, covering both stochastic (DDPM-like) and deterministic (DDIM-like) sampling methods. Notably, we derive the first Wasserstein convergence bound for the Heun sampler and improve existing results for the sampler of the probability flow ODE.
arXiv Detail & Related papers (2025-08-05T08:37:58Z) - Feynman-Kac Correctors in Diffusion: Annealing, Guidance, and Product of Experts [64.34482582690927]
We provide an efficient and principled method for sampling from a sequence of annealed, geometric-averaged, or product distributions derived from pretrained score-based models. We propose Sequential Monte Carlo (SMC) resampling algorithms that leverage inference-time scaling to improve sampling quality.
arXiv Detail & Related papers (2025-03-04T17:46:51Z) - Latent Schrodinger Bridge: Prompting Latent Diffusion for Fast Unpaired Image-to-Image Translation [58.19676004192321]
Diffusion models (DMs), which enable both image generation from noise and inversion from data, have inspired powerful unpaired image-to-image (I2I) translation algorithms.
We tackle this problem with Schrodinger Bridges (SBs), which are stochastic differential equations (SDEs) between distributions with minimal transport cost.
Inspired by this observation, we propose Latent Schrodinger Bridges (LSBs) that approximate the SB ODE via pre-trained Stable Diffusion.
We demonstrate that our algorithm successfully conducts competitive I2I translation in an unsupervised setting with only a fraction of the cost required by previous DM-based methods.
arXiv Detail & Related papers (2024-11-22T11:24:14Z) - Weak Generative Sampler to Efficiently Sample Invariant Distribution of Stochastic Differential Equation [8.67581853745823]
We introduce a framework that employs a weak generative sampler (WGS) to directly generate independent and identically distributed (iid) samples. Our proposed loss function is based on the weak form of the Fokker--Planck equation.
arXiv Detail & Related papers (2024-05-29T16:41:42Z) - Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z) - Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current gradient.
arXiv Detail & Related papers (2023-03-06T18:59:19Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.