Pathway to $O(\sqrt{d})$ Complexity bound under Wasserstein metric of flow-based models
- URL: http://arxiv.org/abs/2512.06702v1
- Date: Sun, 07 Dec 2025 07:26:39 GMT
- Title: Pathway to $O(\sqrt{d})$ Complexity bound under Wasserstein metric of flow-based models
- Authors: Xiangjun Meng, Zhongjian Wang
- Abstract summary: We provide tools to estimate the error of flow-based generative models under the Wasserstein metric. We show this error can be explicitly controlled by two parts: the Lipschitzness of the push-forward maps of the backward flow, which scales independently of the dimension, and a local discretization error that scales as $O(\sqrt{d})$.
- Score: 1.724966705006084
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We provide analytical tools to estimate the error of flow-based generative models under the Wasserstein metric and to establish the optimal sampling iteration complexity bound of $O(\sqrt{d})$ with respect to the dimension. We show this error can be explicitly controlled by two parts: the Lipschitzness of the push-forward maps of the backward flow, which scales independently of the dimension; and a local discretization error, which scales as $O(\sqrt{d})$ in the dimension. The former is related to the existence of Lipschitz changes of variables induced by the (heat) flow. The latter depends on the regularity of the score function in both the spatial and temporal directions. These assumptions hold for the flow-based generative models associated with the Föllmer process and the $1$-rectified flow under a Gaussian tail assumption. As a consequence, we show that the sampling iteration complexity grows linearly with the square root of the trace of the covariance operator associated with the invariant distribution of the forward process.
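To make the structure of the bound concrete, here is a schematic version of the decomposition sketched in the abstract; the notation ($T_k$ for the per-step push-forward maps, $\varepsilon_k$ for the local errors, $N$ for the step count) is ours and purely illustrative, not the paper's.

```latex
% Schematic Wasserstein error decomposition (illustrative notation):
%   T_k   : push-forward map of the backward flow over step k,
%   eps_k : local one-step discretization error, O(sqrt(d)) in the dimension d.
\[
  W_2\bigl(\hat{p}_{\mathrm{gen}},\, p_{\mathrm{data}}\bigr)
  \;\lesssim\;
  \underbrace{\max_{k}\,\operatorname{Lip}(T_k)}_{\text{dimension-free}}
  \;\cdot\; \sum_{k=1}^{N} \varepsilon_k,
  \qquad \varepsilon_k = O\!\bigl(\sqrt{d}\bigr),
\]
% so, holding the target accuracy fixed, the number of iterations N needed to
% control the sum scales as O(sqrt(d)); the precise step-size dependence of
% eps_k is worked out in the paper and is not reproduced here.
```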
Related papers
- Total Variation Rates for Riemannian Flow Matching [8.235086108564998]
We develop a nonasymptotic Total Variation analysis for RFM samplers. Our key technical ingredient is a differential inequality governing the evolution of TV between two manifold ODE flows.
arXiv Detail & Related papers (2026-02-05T01:06:53Z) - Stabilizing Fixed-Point Iteration for Markov Chain Poisson Equations [49.702772230127465]
We study finite-state Markov chains with $n$ states and transition matrix $P$. We show that all non-decaying modes are captured by a real peripheral invariant subspace $\mathcal{K}(P)$, and that the induced operator on the quotient space $\mathbb{R}^n/\mathcal{K}(P)$ is strictly contractive, yielding a unique quotient solution.
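As a concrete illustration of that contraction, below is a minimal Python sketch for the ergodic special case, where the peripheral subspace $\mathcal{K}(P)$ reduces to the span of the constant vector; the function name and the deflation-by-centering step are our own illustration under that assumption, not the paper's algorithm.

```python
import numpy as np

def poisson_fixed_point(P, f, tol=1e-10, max_iter=100_000):
    """Solve the Poisson equation (I - P) u = f - pi(f) by fixed-point
    iteration, deflating the constant mode (the peripheral subspace of an
    ergodic chain) so the iteration contracts on the quotient space."""
    # Stationary distribution pi: left eigenvector of P for eigenvalue 1.
    w, V = np.linalg.eig(P.T)
    pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])
    pi /= pi.sum()
    rhs = f - pi @ f           # center f so a solution exists
    u = np.zeros(P.shape[0])
    for _ in range(max_iter):
        u_next = P @ u + rhs
        u_next -= pi @ u_next  # project off the invariant (constant) mode
        if np.linalg.norm(u_next - u) < tol:
            return u_next
        u = u_next
    return u
```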
arXiv Detail & Related papers (2026-01-31T02:57:01Z) - Order-Optimal Sample Complexity of Rectified Flows [43.61958734990224]
We study rectified flow models, which constrain transport trajectories to be linear from the base distribution to the data distribution. This structural restriction greatly accelerates sampling, often enabling high-quality generation with a single step.
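For background (standard in the rectified-flow literature, not a result of this paper), the $1$-rectified flow is built by linear interpolation between a coupling of the base and data distributions:

```latex
% Background: the standard 1-rectified-flow construction.
% Couple X_0 ~ base and X_1 ~ data, interpolate linearly:
\[
  X_t = (1 - t)\, X_0 + t\, X_1, \qquad t \in [0, 1],
\]
% learn the conditional-expectation velocity field
\[
  v(x, t) = \mathbb{E}\left[\, X_1 - X_0 \mid X_t = x \,\right],
\]
% and sample by integrating dZ_t/dt = v(Z_t, t) from Z_0 ~ base;
% linearity of the trajectories is what permits very few (even one) steps.
```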
arXiv Detail & Related papers (2026-01-28T04:55:14Z) - Generative Modeling with Continuous Flows: Sample Complexity of Flow Matching [60.37045080890305]
We provide the first analysis of the sample complexity for flow-matching based generative models. We decompose the velocity field estimation error into neural-network approximation error, statistical error due to the finite sample size, and optimization error due to the finite number of optimization steps for estimating the velocity field.
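Schematically, that decomposition takes the following form; the symbols ($v^{\star}$, $\hat{v}$, $n$, $T$) are our own shorthand, not the paper's notation:

```latex
% Schematic three-term decomposition of the velocity estimation error:
%   v*    : true velocity field,   v-hat : trained estimate,
%   n     : sample size,           T     : number of optimization steps.
\[
  \|\hat{v} - v^{\star}\|
  \;\le\;
  \underbrace{\inf_{\theta} \|v_{\theta} - v^{\star}\|}_{\text{network approximation}}
  \;+\;
  \underbrace{\epsilon_{\mathrm{stat}}(n)}_{\text{finite samples}}
  \;+\;
  \underbrace{\epsilon_{\mathrm{opt}}(T)}_{\text{finite training steps}}.
\]
```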
arXiv Detail & Related papers (2025-12-01T05:14:25Z) - Parameter-free Algorithms for the Stochastically Extended Adversarial Model [59.81852138768642]
Existing approaches for the Stochastically Extended Adversarial (SEA) model require prior knowledge of problem-specific parameters, such as the diameter $D$ of the domain and the Lipschitz constant $G$ of the loss functions. We develop parameter-free methods by leveraging the Optimistic Online Newton Step (OONS) algorithm to eliminate the need for these parameters.
arXiv Detail & Related papers (2025-10-06T10:53:37Z) - Fast Convergence for High-Order ODE Solvers in Diffusion Probabilistic Models [5.939858158928473]
Diffusion probabilistic models generate samples by learning to reverse a noise-injection process that transforms data into noise. A key development is the reformulation of the reverse sampling process as a deterministic probability flow ordinary differential equation (ODE). We present a rigorous convergence analysis of deterministic samplers derived from ODEs for general forward processes with arbitrary variance schedules.
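For reference, the probability flow ODE in question is the standard one from the diffusion literature (background, not a contribution of this paper):

```latex
% Background: for a forward SDE  dX_t = f(X_t, t) dt + g(t) dW_t  with
% marginal densities p_t, the probability flow ODE with the same marginals is
\[
  \frac{\mathrm{d}X_t}{\mathrm{d}t}
  = f(X_t, t) \;-\; \tfrac{1}{2}\, g(t)^2\, \nabla_x \log p_t(X_t),
\]
% so a deterministic sampler integrates this ODE in reverse time, with the
% score  \nabla_x \log p_t  replaced by a learned estimate.
```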
arXiv Detail & Related papers (2025-06-16T03:09:25Z) - Wasserstein Bounds for generative diffusion models with Gaussian tail targets [1.540806558564153]
We present an estimate of the Wasserstein distance between the data distribution and the distribution generated by score-based generative models. We assume Gaussian-type tail behavior of the data distribution and an $\epsilon$-accurate approximation of the score.
arXiv Detail & Related papers (2024-12-15T17:20:42Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space in which to model functions represented by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, within both the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z) - Sampling and estimation on manifolds using the Langevin diffusion [45.57801520690309]
Two estimators of linear functionals of $\mu_\phi$ based on the discretized Markov process are considered. Error bounds are derived for sampling and estimation using a discretization of an intrinsically defined Langevin diffusion.
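As a rough illustration of the kind of scheme being analyzed, here is a minimal Euclidean sketch of the unadjusted Langevin algorithm targeting $\mu_\phi \propto e^{-\phi}$, together with an ergodic-average estimator of a linear functional; the paper's intrinsically defined manifold construction is not reproduced here, and all names below are our own.

```python
import numpy as np

def langevin_estimate(grad_phi, x0, h, n_steps, test_fn, rng=None):
    """Unadjusted Langevin algorithm targeting mu_phi ~ exp(-phi):
        X_{k+1} = X_k - h * grad_phi(X_k) + sqrt(2h) * xi_k,
    with the ergodic average (1/N) sum_k test_fn(X_k) as an estimator
    of the linear functional  int test_fn d(mu_phi)."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    acc = 0.0
    for _ in range(n_steps):
        x = x - h * grad_phi(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
        acc += test_fn(x)
    return acc / n_steps

# Example: standard Gaussian target (phi(x) = |x|^2 / 2), estimating E[x_0^2] = 1.
est = langevin_estimate(grad_phi=lambda x: x, x0=np.zeros(2),
                        h=1e-2, n_steps=200_000, test_fn=lambda x: x[0] ** 2)
```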
arXiv Detail & Related papers (2023-12-22T18:01:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.