Exponential Convergence Guarantees for Iterative Markovian Fitting
- URL: http://arxiv.org/abs/2510.20871v1
- Date: Thu, 23 Oct 2025 08:34:04 GMT
- Title: Exponential Convergence Guarantees for Iterative Markovian Fitting
- Authors: Marta Gentiloni Silveri, Giovanni Conforti, Alain Durmus
- Abstract summary: We provide the first non-asymptotic exponential convergence guarantees for Iterative Markovian Fitting (IMF). Our results encompass two key regimes: one where the marginals are log-concave, and another where they are weakly log-concave. The analysis relies on new contraction results for the Markovian projection operator and paves the way to theoretical guarantees for DSBM.
- Score: 15.697611355182005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Schrödinger Bridge (SB) problem has become a fundamental tool in computational optimal transport and generative modeling. To address this problem, ideal methods such as Iterative Proportional Fitting (IPF) and Iterative Markovian Fitting (IMF) have been proposed, alongside practical approximations like the Diffusion Schrödinger Bridge and its Matching (DSBM) variant. While previous work has established asymptotic convergence guarantees for IMF, a quantitative, non-asymptotic understanding has been lacking. In this paper, we provide the first non-asymptotic exponential convergence guarantees for IMF under mild structural assumptions on the reference measure and marginal distributions, assuming a sufficiently large time horizon. Our results encompass two key regimes: one where the marginals are log-concave, and another where they are weakly log-concave. The analysis relies on new contraction results for the Markovian projection operator and paves the way to theoretical guarantees for DSBM.
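For orientation, the IMF scheme alternates two projections of a path measure, and the convergence result described above can be stated schematically as a contraction in Kullback-Leibler divergence. The symbols below are illustrative notation for a sketch of this shape, not the paper's exact statement or constants:

```latex
% IMF alternates Markovian and reciprocal projections of a path measure \pi_n:
%   \pi_{2n+1} = \operatorname{proj}_{\mathcal{M}}(\pi_{2n}), \qquad
%   \pi_{2n+2} = \operatorname{proj}_{\mathcal{R}}(\pi_{2n+1}).
% Exponential convergence to the Schrödinger bridge \pi^\star means there is
% a contraction factor \rho \in (0,1) such that
\mathrm{KL}(\pi_n \,\|\, \pi^\star) \;\le\; \rho^{\,n}\,\mathrm{KL}(\pi_0 \,\|\, \pi^\star).
```

Under this reading, "exponential convergence with an explicit contraction factor" means \(\rho\) is given in terms of the structural assumptions (log-concavity of the marginals and the time horizon).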
Related papers
- Plug-and-Play Diffusion Meets ADMM: Dual-Variable Coupling for Robust Medical Image Reconstruction [45.25461515976432]
Plug-and-Play diffusion prior (DP) frameworks have emerged as a powerful paradigm for imaging reconstruction. We present a novel approach to resolving the bias-hallucination trade-off, achieving state-of-the-art results with significantly accelerated convergence.
arXiv Detail & Related papers (2026-02-26T16:58:43Z)
- Sharp Convergence Rates for Masked Diffusion Models [53.117058231393834]
We develop a total-variation based analysis for the Euler method that overcomes limitations. Our results relax assumptions on score estimation, improve parameter dependencies, and establish convergence guarantees. Overall, our analysis introduces a direct TV-based error decomposition along the CTMC trajectory and a decoupling-based path-wise analysis for FHS.
arXiv Detail & Related papers (2026-02-26T00:47:51Z)
- Fast Model Selection and Stable Optimization for Softmax-Gated Multinomial-Logistic Mixture of Experts Models [40.216463162163976]
We develop a batch minorization-maximization algorithm for softmax-gated multinomial-logistic MoE. We also prove finite-sample rates for conditional density estimation and parameter recovery. Experiments on biological protein-protein interaction prediction validate the full pipeline.
arXiv Detail & Related papers (2026-02-08T14:45:41Z)
- Unified Inference Framework for Single and Multi-Player Performative Prediction: Method and Asymptotic Optimality [15.289993502701305]
This paper introduces a unified statistical inference framework that bridges single-agent and multi-agent performativity. It provides a principled toolkit for reliable estimation and decision-making in dynamic, performative environments.
arXiv Detail & Related papers (2026-02-03T03:17:54Z)
- The Procrustean Bed of Time Series: The Optimization Bias of Point-wise Loss [53.542743390809356]
This paper aims to provide a first-principles analysis of the Expectation of Optimization Bias (EOB). Our analysis reveals a fundamental paradox: the more deterministic and structured the time series, the more severe the bias induced by a point-wise loss function. We present a concrete solution that simultaneously achieves both principles via the DFT or DWT.
arXiv Detail & Related papers (2025-12-21T06:08:22Z)
- Fractional Diffusion Bridge Models [31.569955540169104]
We present Fractional Diffusion Bridge Models (FDBM), a novel generative diffusion bridge framework driven by an approximation of the rich and non-Markovian fractional Brownian motion (fBM).
arXiv Detail & Related papers (2025-11-03T17:51:10Z)
- Non-asymptotic error bounds for probability flow ODEs under weak log-concavity [6.661419982187023]
This work establishes non-asymptotic convergence bounds in the 2-Wasserstein distance for a general class of probability flow ODEs. Our results extend convergence theory to more realistic data distributions and practical ODE solvers.
arXiv Detail & Related papers (2025-10-20T14:54:38Z)
- Exponential convergence rate for Iterative Markovian Fitting [58.760054965084656]
The Iterative Markovian Fitting (IMF) algorithm converges in Kullback-Leibler divergence to the ground-truth solution. We establish for the first time that IMF exhibits exponential convergence with an explicit contraction factor.
arXiv Detail & Related papers (2025-08-04T09:48:21Z)
- Diffusion & Adversarial Schrödinger Bridges via Iterative Proportional Markovian Fitting [87.37278888311839]
The Iterative Markovian Fitting (IMF) procedure successfully solves the Schrödinger Bridge (SB) problem. We show a close connection between IMF and the Iterative Proportional Fitting (IPF) procedure. Combining the two yields what we call the Iterative Proportional Markovian Fitting (IPMF) procedure.
arXiv Detail & Related papers (2024-10-03T15:43:17Z)
- Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth Games: Convergence Analysis under Expected Co-coercivity [49.66890309455787]
We introduce the expected co-coercivity condition, explain its benefits, and provide the first last-iterate convergence guarantees of SGDA and SCO.
We prove linear convergence of both methods to a neighborhood of the solution when a constant step size is used.
Our convergence guarantees hold under the arbitrary sampling paradigm, and we give insights into the complexity of minibatching.
arXiv Detail & Related papers (2021-06-30T18:32:46Z)
- On Centralized and Distributed Mirror Descent: Exponential Convergence Analysis Using Quadratic Constraints [8.336315962271396]
Mirror descent (MD) is a powerful first-order optimization technique that subsumes several algorithms, including gradient descent (GD).
We study the exact convergence rate of MD in both centralized and distributed cases for strongly convex and smooth problems.
arXiv Detail & Related papers (2021-05-29T23:05:56Z)
- A Unified Joint Maximum Mean Discrepancy for Domain Adaptation [73.44809425486767]
This paper theoretically derives a unified form of JMMD that is easy to optimize.
From the revealed unified JMMD, we illustrate that JMMD degrades the feature-label dependence that benefits classification.
We propose a novel MMD matrix to promote the dependence, and devise a novel label kernel that is robust to label distribution shift.
arXiv Detail & Related papers (2021-01-25T09:46:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.