High-accuracy and dimension-free sampling with diffusions
- URL: http://arxiv.org/abs/2601.10708v1
- Date: Thu, 15 Jan 2026 18:58:50 GMT
- Title: High-accuracy and dimension-free sampling with diffusions
- Authors: Khashayar Gatmiry, Sitan Chen, Adil Salim
- Abstract summary: We propose a new solver for diffusion models relying on a subtle interplay between low-degree approximation and the collocation method. We prove that its complexity scales \emph{polylogarithmically} in $1/\varepsilon$, yielding the first ``high-accuracy'' guarantee for a diffusion-based sampler.
- Score: 27.7060066305274
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion models have shown remarkable empirical success in sampling from rich multi-modal distributions. Their inference relies on numerically solving a certain differential equation. This differential equation cannot be solved in closed form, and its resolution via discretization typically requires many small iterations to produce \emph{high-quality} samples. More precisely, prior works have shown that the iteration complexity of discretization methods for diffusion models scales polynomially in the ambient dimension and the inverse accuracy $1/\varepsilon$. In this work, we propose a new solver for diffusion models relying on a subtle interplay between low-degree approximation and the collocation method (Lee, Song, Vempala 2018), and we prove that its iteration complexity scales \emph{polylogarithmically} in $1/\varepsilon$, yielding the first ``high-accuracy'' guarantee for a diffusion-based sampler that only uses (approximate) access to the scores of the data distribution. In addition, our bound does not depend explicitly on the ambient dimension; more precisely, the dimension affects the complexity of our solver through the \emph{effective radius} of the support of the target distribution only.
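The abstract does not include pseudocode, but the collocation idea it builds on (in the spirit of Lee, Song, Vempala 2018) is Picard iteration over a grid of collocation nodes, with the integral form of the ODE evaluated by quadrature. The sketch below is a toy scalar version under that assumption; the function name, node count, and quadrature rule are illustrative choices, not the authors' solver.

```python
import math

def picard_collocation(f, x0, h, n_nodes=50, n_iters=30):
    """Solve x'(t) = f(t, x) on [0, h] by Picard iteration on a uniform
    grid of collocation nodes, using trapezoidal quadrature for the
    fixed-point map x(t) = x0 + integral_0^t f(s, x(s)) ds."""
    ts = [h * i / (n_nodes - 1) for i in range(n_nodes)]
    xs = [x0] * n_nodes  # initial guess: constant trajectory
    for _ in range(n_iters):
        fs = [f(t, x) for t, x in zip(ts, xs)]
        new = [x0]
        acc = 0.0
        for i in range(1, n_nodes):
            # trapezoid rule on the sub-interval [t_{i-1}, t_i]
            acc += 0.5 * (fs[i - 1] + fs[i]) * (ts[i] - ts[i - 1])
            new.append(x0 + acc)
        xs = new
    return ts, xs

# sanity check on x' = x, x(0) = 1, whose exact solution is e^t
ts, xs = picard_collocation(lambda t, x: x, 1.0, 0.5)
```

For a small step $h$ the Picard map is a contraction, so a handful of iterations suffice; the polylogarithmic dependence on $1/\varepsilon$ claimed in the paper comes from the interplay with low-degree approximation, which this toy sketch does not capture.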
Related papers
- Efficient Sampling with Discrete Diffusion Models: Sharp and Adaptive Guarantees [9.180350432640912]
We study the sampling efficiency of score-based discrete diffusion models under a continuous-time Markov chain (CTMC) formulation. For uniform discrete diffusion, we show that the $\tau$-leaping algorithm achieves a complexity of order $\tilde{O}(d/\varepsilon)$. For masking discrete diffusion, we introduce a modified $\tau$-leaping sampler whose convergence rate is governed by an intrinsic information-theoretic quantity.
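The $\tau$-leaping idea referenced above (freeze the jump rates over a window of length $\tau$ and draw a Poisson number of jumps per channel) can be illustrated on a toy birth-death CTMC; the chain and the names below are illustrative and not the paper's discrete-diffusion setting.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplicative algorithm for Poisson(lam); fine for small lam."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def tau_leap_step(x, tau, birth_rate, death_rate, rng):
    """One tau-leaping step: rates are frozen at the current state x for
    the whole window, and each channel fires a Poisson number of times."""
    births = poisson(birth_rate * tau, rng)
    deaths = poisson(death_rate * x * tau, rng)
    return max(0, x + births - deaths)  # clamp to keep the state valid

rng = random.Random(0)
x = 10
for _ in range(1000):
    x = tau_leap_step(x, 0.1, 2.0, 0.1, rng)
```

With constant birth rate 2.0 and per-individual death rate 0.1, the chain equilibrates around mean 20; shrinking $\tau$ trades bias for more steps, which is exactly the complexity trade-off the entry analyzes.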
arXiv Detail & Related papers (2026-02-16T18:48:17Z) - Polynomial Convergence of Riemannian Diffusion Models [46.72936436234762]
Diffusion models are considered one of the state-of-the-art generative models in modern AI. Most of the existing literature assumes that the underlying space is Euclidean. In many practical applications, the data are constrained to lie on a submanifold of Euclidean space.
arXiv Detail & Related papers (2026-01-05T19:14:09Z) - Score-based sampling without diffusions: Guidance from a simple and modular scheme [0.0]
We show how to design forward trajectories such that both (a) the terminal distribution and (b) each of the backward conditional distributions are given by a strongly log-concave (SLC) distribution. This modular reduction allows us to exploit \emph{any} SLC sampling algorithm in order to traverse the backwards path. The use of high-accuracy routines yields $\varepsilon$-accurate answers, in either KL or Wasserstein distances.
arXiv Detail & Related papers (2025-12-30T11:34:59Z) - From Score Matching to Diffusion: A Fine-Grained Error Analysis in the Gaussian Setting [25.21429354164613]
We show that the Wasserstein sampling error can be expressed as a kernel-type norm of the data power spectrum.
arXiv Detail & Related papers (2025-03-14T17:35:00Z) - Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers. We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions. This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Instance-dependent Convergence Theory for Diffusion Models [7.237817437521988]
We develop a convergence rate that is adaptive to the smoothness of different target distributions, referred to as an instance-dependent bound. In addition, $L$ represents a relaxed Lipschitz constant, which, in the case of Gaussian mixture models, scales only logarithmically with the number of components.
arXiv Detail & Related papers (2024-10-17T16:37:33Z) - Non-asymptotic bounds for forward processes in denoising diffusions: Ornstein-Uhlenbeck is hard to beat [49.1574468325115]
This paper presents explicit non-asymptotic bounds on the forward diffusion error in total variation (TV). We parametrise multi-modal data distributions in terms of the distance $R$ to their furthest modes and consider forward diffusions with additive and multiplicative noise.
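The Ornstein-Uhlenbeck forward process named in this entry's title has a closed-form marginal, $x_t = e^{-t} x_0 + \sqrt{1 - e^{-2t}}\, z$ with $z \sim \mathcal{N}(0, 1)$, so the forward diffusion error can be probed empirically. A minimal sketch under that assumption, with toy bimodal 1-D data at distance $R = 3$ from the origin (all names illustrative):

```python
import math
import random

def ou_forward(x0, t, rng):
    """Exact OU marginal at time t: e^{-t} x0 + sqrt(1 - e^{-2t}) * N(0, 1)."""
    decay = math.exp(-t)
    return decay * x0 + math.sqrt(1.0 - decay * decay) * rng.gauss(0.0, 1.0)

rng = random.Random(0)
# toy multi-modal data: two modes at +/- R = 3, sampled uniformly
samples = [ou_forward(rng.choice([-3.0, 3.0]), 4.0, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

At $t = 4$ the mode locations are shrunk by $e^{-4} \approx 0.018$, so the marginal is already close to a standard Gaussian, illustrating how the TV error to the stationary distribution depends on $R$ and $t$.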
arXiv Detail & Related papers (2024-08-25T10:28:31Z) - Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z) - Broadening Target Distributions for Accelerated Diffusion Models via a Novel Analysis Approach [49.97755400231656]
We show that a new accelerated DDPM sampler achieves accelerated performance for three broad distribution classes not considered before. Our results show an improved dependency on the data dimension $d$ among accelerated DDPM-type samplers.
arXiv Detail & Related papers (2024-02-21T16:11:47Z) - Sampling and estimation on manifolds using the Langevin diffusion [45.57801520690309]
Two estimators of linear functionals of $\mu_\phi$ based on the discretized Markov process are considered. Error bounds are derived for sampling and estimation using a discretization of an intrinsically defined Langevin diffusion.
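In the Euclidean special case, the discretized Langevin diffusion is the unadjusted Langevin algorithm (ULA); a minimal sketch of estimating a linear functional of $\mu_\phi \propto e^{-\phi}$ by trajectory averaging follows, using a toy Gaussian potential as an assumption (the intrinsically defined manifold version in the paper is more involved, and all names here are illustrative).

```python
import math
import random

def ula_estimate(grad_phi, functional, x0, step, n_steps, burn_in, seed=0):
    """Unadjusted Langevin algorithm:
        x <- x - step * grad_phi(x) + sqrt(2 * step) * N(0, 1),
    then average `functional` over the post-burn-in trajectory."""
    rng = random.Random(seed)
    x = x0
    total = 0.0
    for k in range(n_steps):
        x = x - step * grad_phi(x) + math.sqrt(2 * step) * rng.gauss(0.0, 1.0)
        if k >= burn_in:
            total += functional(x)
    return total / (n_steps - burn_in)

# toy target: phi(x) = x^2 / 2, i.e. a standard Gaussian; estimate E[x]
mean_est = ula_estimate(lambda x: x, lambda x: x, 5.0, 0.05, 20000, 2000)
```

The discretization introduces an $O(\text{step})$ bias in the stationary distribution, which is exactly the kind of error the entry's bounds quantify for the intrinsic manifold setting.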
arXiv Detail & Related papers (2023-12-22T18:01:11Z) - Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z) - $\infty$-Diff: Infinite Resolution Diffusion with Subsampled Mollified States [13.75813166759549]
$\infty$-Diff is a generative diffusion model defined in an infinite-dimensional Hilbert space.
By training on randomly sampled subsets of coordinates, we learn a continuous function for arbitrary resolution sampling.
arXiv Detail & Related papers (2023-03-31T17:58:08Z) - Distributed, partially collapsed MCMC for Bayesian Nonparametrics [68.5279360794418]
We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, are decomposable into independent sub-measures.
We use this decomposition to partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components.
The resulting hybrid algorithm can be applied to allow scalable inference without sacrificing convergence guarantees.
arXiv Detail & Related papers (2020-01-15T23:10:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.