Discrete Adjoint Schrödinger Bridge Sampler
- URL: http://arxiv.org/abs/2602.08243v1
- Date: Mon, 09 Feb 2026 03:41:47 GMT
- Title: Discrete Adjoint Schrödinger Bridge Sampler
- Authors: Wei Guo, Yuchen Zhu, Xiaochen Du, Juno Nam, Yongxin Chen, Rafael Gómez-Bombarelli, Guan-Horng Liu, Molei Tao, Jaemoo Choi
- Abstract summary: Adjoint matching (AM) excels in continuous domains but remains unexplored for discrete spaces. We introduce discrete ASBS, a unified framework that extends AM and the adjoint Schrödinger bridge sampler (ASBS) to discrete spaces. Empirically, discrete ASBS achieves competitive sample quality with significant advantages in training efficiency and scalability.
- Score: 45.568569543075085
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning discrete neural samplers is challenging due to the lack of gradients and combinatorial complexity. While stochastic optimal control (SOC) and Schrödinger bridge (SB) provide principled solutions, efficient SOC solvers like adjoint matching (AM), which excel in continuous domains, remain unexplored for discrete spaces. We bridge this gap by revealing that the core mechanism of AM is $\mathit{state}\text{-}\mathit{space~agnostic}$, and introduce $\mathbf{discrete~ASBS}$, a unified framework that extends AM and adjoint Schrödinger bridge sampler (ASBS) to discrete spaces. Theoretically, we analyze the optimality conditions of the discrete SB problem and its connection to SOC, identifying a necessary cyclic group structure on the state space to enable this extension. Empirically, discrete ASBS achieves competitive sample quality with significant advantages in training efficiency and scalability.
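The cyclic group structure the abstract identifies can be made concrete with a toy forward process: a discrete noising chain on $\mathbb{Z}_K$ where each coordinate is perturbed by a random cyclic shift. The kernel below is a hypothetical illustration of such a process, not the paper's actual construction:

```python
import random

def cyclic_noise_step(x, K, shift_prob, rng):
    """One forward noising step on the cyclic group Z_K: each coordinate
    is replaced by a uniformly random cyclic shift of itself with
    probability shift_prob (a toy kernel, for illustration only)."""
    return [
        (xi + rng.randrange(K)) % K if rng.random() < shift_prob else xi
        for xi in x
    ]

def noise_trajectory(x0, K, steps, shift_prob, rng):
    """Roll the kernel forward; long trajectories mix toward the
    uniform distribution on Z_K^d."""
    traj = [list(x0)]
    for _ in range(steps):
        traj.append(cyclic_noise_step(traj[-1], K, shift_prob, rng))
    return traj
```

Because shifts compose within the group, every state stays in $\{0, \dots, K-1\}^d$, which is the closure property that makes a group-structured state space convenient for discrete bridges.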
Related papers
- Learnable Chernoff Baselines for Inference-Time Alignment [64.81256817158851]
We introduce Learnable Chernoff Baselines as a method for efficiently and approximately sampling from exponentially tilted kernels. We establish total-variation guarantees to the ideal aligned model, and demonstrate in both continuous and discrete diffusion settings that LCB sampling closely matches ideal rejection sampling.
arXiv Detail & Related papers (2026-02-08T00:09:40Z) - Improved Sample Complexity for Full Coverage in Compact and Continuous Spaces [0.0]
We study uniform random sampling on the $d$-dimensional unit hypercube. We derive a sample complexity bound with a logarithmic dependence on the failure probability. Our findings offer a sharper theoretical tool for algorithms that rely on grid-based coverage guarantees.
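For intuition on this kind of coverage guarantee, the classical coupon-collector union bound already gives a logarithmic dependence on the failure probability: to hit every cell of a $k^d$ grid with uniform samples, $n \ge m \ln(m/\delta)$ samples suffice, where $m = k^d$. This is the standard argument, not the paper's sharper bound:

```python
import math

def coverage_sample_bound(k, d, delta):
    """Union-bound sample complexity for covering every cell of a k^d
    grid on the unit hypercube with uniform samples (the classical
    coupon-collector bound, NOT the paper's improved one):
    each cell is missed w.p. (1 - 1/m)^n <= exp(-n/m), so
    n >= m * ln(m / delta) drives the union bound below delta."""
    m = k ** d
    return math.ceil(m * math.log(m / delta))
```

Note the failure probability $\delta$ enters only inside the logarithm, which is the dependence the abstract highlights.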
arXiv Detail & Related papers (2025-11-21T21:06:14Z) - Smoothed Agnostic Learning of Halfspaces over the Hypercube [14.269955880630404]
We introduce a new smoothed learning framework for Boolean inputs, where perturbations are modeled via random bit flips. Under strictly subexponential assumptions on the input distribution, we give an efficient algorithm for learning halfspaces.
arXiv Detail & Related papers (2025-11-21T20:59:11Z) - A Closed-Form Framework for Schrödinger Bridges Between Arbitrary Densities [0.0]
We introduce a unified closed-form framework for representing the dynamics of Schrödinger Bridge systems. We develop a simulation-free algorithm that infers SB dynamics directly from samples of the source and target distributions. This work opens a new direction for efficient and scalable diffusion modeling across scientific and machine learning applications.
arXiv Detail & Related papers (2025-11-11T03:08:26Z) - Entering the Era of Discrete Diffusion Models: A Benchmark for Schrödinger Bridges and Entropic Optimal Transport [46.28885837515665]
We introduce a benchmark for the Schrödinger bridge (SB) problem on discrete spaces. Our construction yields pairs of probability distributions with analytically known SB solutions, enabling rigorous evaluation. This work provides the first step toward proper evaluation of SB methods on discrete spaces.
arXiv Detail & Related papers (2025-09-27T14:51:07Z) - Adjoint Schrödinger Bridge Sampler [27.07623265593163]
Adjoint Schrödinger Bridge Sampler (ASBS) is a new diffusion sampler that employs simple and scalable matching-based objectives. ASBS is grounded on a mathematical model -- the Schrödinger Bridge -- which enhances sampling efficiency via kinetic-optimal transportation.
arXiv Detail & Related papers (2025-06-27T18:27:59Z) - Rethinking Clustered Federated Learning in NOMA Enhanced Wireless Networks [60.09912912343705]
This study explores the benefits of integrating the novel clustered federated learning (CFL) approach with non-independent and identically distributed (non-IID) datasets.
A detailed theoretical analysis of the generalization gap that measures the degree of non-IID in the data distribution is presented.
Solutions to address the challenges posed by non-IID conditions are proposed with the analysis of the properties.
arXiv Detail & Related papers (2024-03-05T17:49:09Z) - Generalized Schrödinger Bridge Matching [54.171931505066]
The Generalized Schrödinger Bridge (GSB) problem setup is prevalent in many scientific areas both within and without machine learning.
We propose Generalized Schrödinger Bridge Matching (GSBM), a new matching algorithm inspired by recent advances.
We show that such a generalization can be cast as solving conditional optimal control, for which variational approximations can be used.
arXiv Detail & Related papers (2023-10-03T17:42:11Z) - Transport with Support: Data-Conditional Diffusion Bridges [18.933928516349397]
We introduce the Iterative Smoothing Bridge (ISB) to solve constrained time-series data generation tasks.
We show that the ISB generalises well to high-dimensional data, is computationally efficient, and provides accurate estimates of the marginals at intermediate and terminal times.
arXiv Detail & Related papers (2023-01-31T13:50:16Z) - Optimal Scaling for Locally Balanced Proposals in Discrete Spaces [65.14092237705476]
We show that the efficiency of Metropolis-Hastings (M-H) algorithms in discrete spaces can be characterized by an acceptance rate that is independent of the target distribution.
Knowledge of the optimal acceptance rate allows one to automatically tune the neighborhood size of a proposal distribution in a discrete space, directly analogous to step-size control in continuous spaces.
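The tuning loop this describes can be sketched with a plain M-H sampler on $\{0,1\}^d$ whose neighborhood size is the number of bits flipped per proposal. This is a simplified stand-in: the paper's locally balanced proposals additionally weight candidate flips by the target, which is omitted here:

```python
import math
import random

def mh_bitflip(log_p, d, steps, rng, radius=1):
    """Metropolis-Hastings on {0, 1}^d with a symmetric flip-`radius`-bits
    proposal (toy sketch; locally balanced weighting is omitted).
    Returns the final state and the empirical acceptance rate."""
    x = [rng.randrange(2) for _ in range(d)]
    accepts = 0
    for _ in range(steps):
        y = x[:]
        for i in rng.sample(range(d), radius):
            y[i] ^= 1  # flip the chosen bits
        # symmetric proposal => acceptance ratio reduces to p(y) / p(x)
        if rng.random() < math.exp(min(0.0, log_p(y) - log_p(x))):
            x, accepts = y, accepts + 1
    return x, accepts / steps
```

Comparing the returned acceptance rate against a target rate and adjusting `radius` accordingly is the discrete analogue of step-size control in continuous spaces.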
arXiv Detail & Related papers (2022-09-16T22:09:53Z) - Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth Games: Convergence Analysis under Expected Co-coercivity [49.66890309455787]
We introduce the expected co-coercivity condition, explain its benefits, and provide the first last-iterate convergence guarantees of SGDA and SCO.
We prove linear convergence of both methods to a neighborhood of the solution when they use constant step-size.
Our convergence guarantees hold under the arbitrary sampling paradigm, and we give insights into the complexity of minibatching.
arXiv Detail & Related papers (2021-06-30T18:32:46Z)
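The last-iterate convergence behavior described above can be checked on a toy game. The sketch below uses full (deterministic) gradients rather than SGDA, on a made-up strongly-convex-strongly-concave objective $f(x, y) = x^2/2 + xy - y^2/2$ with saddle point at the origin, where gradient descent-ascent with a small constant step size converges linearly:

```python
def gda_step(x, y, step):
    """One deterministic gradient descent-ascent step for the toy game
    f(x, y) = x**2 / 2 + x * y - y**2 / 2 (saddle point at the origin).
    A hypothetical stand-in for the SGDA setting in the paper."""
    gx = x + y          # df/dx: descend in x
    gy = x - y          # df/dy: ascend in y
    return x - step * gx, y + step * gy

def run_gda(x0, y0, step, iters):
    """Iterate GDA; for step in (0, 1) the update matrix is a
    contraction, so the iterates converge to (0, 0)."""
    x, y = x0, y0
    for _ in range(iters):
        x, y = gda_step(x, y, step)
    return x, y
```

On a purely bilinear game ($f = xy$) the same loop would diverge, which is exactly why structural conditions such as expected co-coercivity are needed for the guarantees in the paper.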
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.