Adjoint Schrödinger Bridge Sampler
- URL: http://arxiv.org/abs/2506.22565v1
- Date: Fri, 27 Jun 2025 18:27:59 GMT
- Title: Adjoint Schrödinger Bridge Sampler
- Authors: Guan-Horng Liu, Jaemoo Choi, Yongxin Chen, Benjamin Kurt Miller, Ricky T. Q. Chen
- Abstract summary: Adjoint Schrödinger Bridge Sampler (ASBS) is a new diffusion sampler that employs simple and scalable matching-based objectives. ASBS is grounded on a mathematical model -- the Schrödinger Bridge -- which enhances sampling efficiency via kinetic-optimal transportation.
- Score: 27.07623265593163
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computational methods for learning to sample from the Boltzmann distribution -- where the target distribution is known only up to an unnormalized energy function -- have advanced significantly in recent years. Due to the lack of explicit target samples, however, prior diffusion-based methods, known as diffusion samplers, often require importance-weighted estimation or complicated learning processes. Both trade off scalability against extensive evaluations of the energy and model, thereby limiting their practical usage. In this work, we propose Adjoint Schrödinger Bridge Sampler (ASBS), a new diffusion sampler that employs simple and scalable matching-based objectives yet without the need to estimate target samples during training. ASBS is grounded on a mathematical model -- the Schrödinger Bridge -- which enhances sampling efficiency via kinetic-optimal transportation. Through a new lens of stochastic optimal control theory, we demonstrate how SB-based diffusion samplers can be learned at scale via Adjoint Matching and prove convergence to the global solution. Notably, ASBS generalizes the recent Adjoint Sampling (Havens et al., 2025) to arbitrary source distributions by relaxing the so-called memoryless condition that largely restricts the design space. Through extensive experiments, we demonstrate the effectiveness of ASBS on sampling from classical energy functions, amortized conformer generation, and molecular Boltzmann distributions.
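As context for the setting described in the abstract -- a target density known only up to an unnormalized energy function -- the following minimal NumPy sketch samples from such a density with discretized Langevin dynamics. This is not the ASBS algorithm; the double-well energy, step size, and chain counts are illustrative assumptions.

```python
import numpy as np

def energy(x):
    # Hypothetical double-well energy; the target density is exp(-energy(x))
    # up to an unknown normalizing constant.
    return (x**2 - 1.0) ** 2

def grad_energy(x):
    return 4.0 * x * (x**2 - 1.0)

def langevin_sample(n_steps=5000, n_chains=1000, step=0.01, seed=0):
    """Overdamped Langevin (Euler-Maruyama): x <- x - step*dE + sqrt(2*step)*noise."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_chains)  # source distribution: standard Gaussian
    for _ in range(n_steps):
        noise = rng.standard_normal(n_chains)
        x = x - step * grad_energy(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()
# samples concentrate near the two modes at x = +1 and x = -1
```

Unlike methods in the ASBS family, this chain needs many sequential energy-gradient evaluations per sample, which is exactly the scalability cost the matching-based diffusion samplers above aim to avoid.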
Related papers
- Non-equilibrium Annealed Adjoint Sampler [27.73022309947818]
We introduce the Non-equilibrium Annealed Adjoint Sampler (NAAS), a novel SOC-based diffusion sampler. NAAS employs a lean adjoint system inspired by adjoint matching, enabling efficient and scalable training.
arXiv Detail & Related papers (2025-06-22T20:41:31Z)
- Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities [85.83359661628575]
We propose Progressive Inference-Time Annealing (PITA) to learn diffusion-based samplers. PITA combines two complementary techniques: annealing of the Boltzmann distribution and diffusion smoothing. It enables equilibrium sampling of N-body particle systems, Alanine Dipeptide, and tripeptides in Cartesian coordinates.
arXiv Detail & Related papers (2025-06-19T17:14:22Z)
- On scalable and efficient training of diffusion samplers [26.45926098524023]
We address the challenge of training diffusion models to sample from unnormalized energy distributions in the absence of data. We propose a scalable and sample-efficient framework that properly harmonizes the powerful classical sampling method and the diffusion sampler. Our method significantly improves sample efficiency on standard benchmarks for diffusion samplers and also excels at higher-dimensional problems and real-world molecular conformer generation.
arXiv Detail & Related papers (2025-05-26T06:16:34Z)
- Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching [33.9461078261722]
We introduce Adjoint Sampling, a highly scalable and efficient algorithm for learning diffusion processes that sample from unnormalized densities. We show how to incorporate key symmetries, as well as periodic boundary conditions, for modeling molecules in both Cartesian and torsional coordinates. We demonstrate the effectiveness of our approach through extensive experiments on classical energy functions, and further scale up to neural network-based energy models.
arXiv Detail & Related papers (2025-04-16T02:20:06Z)
- Accelerated Diffusion Models via Speculative Sampling [89.43940130493233]
Speculative sampling is a popular technique for accelerating inference in Large Language Models. We extend speculative sampling to diffusion models, which generate samples via continuous, vector-valued Markov chains. We propose various drafting strategies, including a simple and effective approach that does not require training a draft model.
arXiv Detail & Related papers (2025-01-09T16:50:16Z)
- Enhanced Importance Sampling through Latent Space Exploration in Normalizing Flows [69.8873421870522]
Importance sampling is a rare-event simulation technique used in Monte Carlo simulations. We propose a method for more efficient sampling by updating the proposal distribution in the latent space of a normalizing flow.
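The latent-space scheme summarized above builds on the standard importance-sampling estimator, sketched below in NumPy. The Gaussian target and proposal are toy stand-ins; the paper's normalizing-flow latent space is not reproduced here.

```python
import numpy as np

# Self-normalized importance sampling: estimate E_p[x^2] for a target
# p = N(0, 1) using draws from a wider proposal q = N(0, 2^2).
rng = np.random.default_rng(0)
n = 100_000
proposal_std = 2.0
x = rng.normal(0.0, proposal_std, size=n)          # draws from the proposal q

log_p = -0.5 * x**2 - 0.5 * np.log(2 * np.pi)      # target log-density
log_q = (-0.5 * (x / proposal_std) ** 2
         - np.log(proposal_std) - 0.5 * np.log(2 * np.pi))
w = np.exp(log_p - log_q)                          # importance weights p/q

# Self-normalized estimate of E_p[x^2]; the true value is 1.
estimate = np.sum(w * x**2) / np.sum(w)
```

When the proposal matches the target poorly, the weights degenerate and the effective sample size collapses; adapting the proposal, as the paper does in a flow's latent space, is one way to keep the weights well behaved.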
arXiv Detail & Related papers (2025-01-06T21:18:02Z)
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers. We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions. This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
We present Iterated Denoising Energy Matching (iDEM). iDEM alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective. We show that the proposed approach achieves state-of-the-art performance on all metrics and trains 2-5× faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function. We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods. Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.