Non-equilibrium Annealed Adjoint Sampler
- URL: http://arxiv.org/abs/2506.18165v2
- Date: Wed, 25 Jun 2025 14:39:40 GMT
- Title: Non-equilibrium Annealed Adjoint Sampler
- Authors: Jaemoo Choi, Yongxin Chen, Molei Tao, Guan-Horng Liu
- Abstract summary: We introduce the Non-equilibrium Annealed Adjoint Sampler (NAAS), a novel SOC-based diffusion sampler. NAAS employs a lean adjoint system inspired by adjoint matching, enabling efficient and scalable training.
- Score: 27.73022309947818
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, there has been significant progress in learning-based diffusion samplers, which aim to sample from a given unnormalized density. These methods typically follow one of two paradigms: (i) formulating sampling as an unbiased stochastic optimal control (SOC) problem using a canonical reference process, or (ii) refining annealed path measures through importance-weighted sampling. Although annealing approaches have the advantage of guiding samples toward high-density regions, their reliance on importance sampling leads to high variance and limited scalability in practice. In this paper, we introduce the Non-equilibrium Annealed Adjoint Sampler (NAAS), a novel SOC-based diffusion sampler that leverages annealed reference dynamics without resorting to importance sampling. NAAS employs a lean adjoint system inspired by adjoint matching, enabling efficient and scalable training. We demonstrate the effectiveness of our approach across a range of tasks, including sampling from classical energy landscapes and molecular Boltzmann distributions.
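For orientation, paradigm (i) can be written schematically as a stochastic optimal control problem. This is a generic textbook-style sketch, not the exact NAAS objective; the reference drift $b_t$, diffusion coefficient $\sigma_t$, and terminal cost $g$ are placeholder symbols rather than notation from the paper:

$$
\min_{u}\; \mathbb{E}\!\left[\int_0^T \tfrac{1}{2}\,\lVert u_t(X_t)\rVert^2 \,\mathrm{d}t \;+\; g(X_T)\right]
\quad \text{s.t.} \quad
\mathrm{d}X_t = \bigl(b_t(X_t) + \sigma_t\, u_t(X_t)\bigr)\,\mathrm{d}t + \sigma_t\, \mathrm{d}W_t,\;\; X_0 \sim p_0 .
$$

With the terminal cost chosen as $g(x) = -\log\!\bigl(\rho_{\text{target}}(x)\,/\,p_T^{\text{ref}}(x)\bigr)$, the optimally controlled process terminates at the (unnormalized) target. NAAS keeps this SOC structure but replaces the canonical reference process with annealed, non-equilibrium reference dynamics, avoiding the importance-sampling step of paradigm (ii).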
Related papers
- Adjoint Schrödinger Bridge Sampler [27.07623265593163]
Adjoint Schrödinger Bridge Sampler (ASBS) is a new diffusion sampler that employs simple and scalable matching-based objectives. ASBS is grounded in a mathematical model, the Schrödinger Bridge, which enhances sampling efficiency via kinetic-optimal transportation.
arXiv Detail & Related papers (2025-06-27T18:27:59Z)
- On scalable and efficient training of diffusion samplers [26.45926098524023]
We address the challenge of training diffusion models to sample from unnormalized energy distributions in the absence of data. We propose a scalable and sample-efficient framework that harmonizes powerful classical sampling methods with the diffusion sampler. Our method significantly improves sample efficiency on standard benchmarks for diffusion samplers and also excels at higher-dimensional problems and real-world molecular conformer generation.
arXiv Detail & Related papers (2025-05-26T06:16:34Z)
- Sequential Controlled Langevin Diffusions [80.93988625183485]
Two popular methods are (1) Sequential Monte Carlo (SMC), where the transport is performed through successive densities via prescribed Markov chains and resampling steps, and (2) recently developed diffusion-based sampling methods, where a learned dynamical transport is used. We present a principled framework for combining SMC with diffusion-based samplers by viewing both methods in continuous time and considering measures on path space. This culminates in the new Sequential Controlled Langevin Diffusion (SCLD) sampling method, which is able to utilize the benefits of both methods and reaches improved performance on multiple benchmark problems, in many cases using only 10% of the training budget of previous diffusion-based samplers.
arXiv Detail & Related papers (2024-12-10T00:47:10Z)
- NETS: A Non-Equilibrium Transport Sampler [15.58993313831079]
We propose an algorithm, termed the Non-Equilibrium Transport Sampler (NETS). NETS can be viewed as a variant of annealed importance sampling (AIS) based on Jarzynski's equality, in which the stochastic dynamics are augmented with a learned drift term. We show that this drift is the minimizer of a variety of objective functions, which can all be estimated in an unbiased fashion. (A minimal sketch of the classical AIS baseline appears after this list.)
arXiv Detail & Related papers (2024-10-03T17:35:38Z)
- Adaptive teachers for amortized samplers [76.88721198565861]
We propose an adaptive training distribution (the teacher) to guide the training of the primary amortized sampler (the student). We validate the effectiveness of this approach in a synthetic environment designed to present an exploration challenge.
arXiv Detail & Related papers (2024-10-02T11:33:13Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains 2-5× faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function. We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods. Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
- Entropy-based Training Methods for Scalable Neural Implicit Sampler [20.35664492719671]
In this paper, we introduce an efficient and scalable implicit neural sampler that overcomes the limitations of existing approaches. The implicit sampler can generate large batches of samples at low computational cost. By employing the two proposed entropy-based training methods, we effectively optimize the neural implicit sampler to learn and generate from the desired target distribution.
arXiv Detail & Related papers (2023-06-08T05:56:05Z)
- Rethinking Collaborative Metric Learning: Toward an Efficient Alternative without Negative Sampling [156.7248383178991]
The Collaborative Metric Learning (CML) paradigm has attracted wide interest in the area of recommendation systems (RS).
We find that negative sampling leads to a biased estimate of the generalization error.
Motivated by this, we propose an efficient alternative to negative sampling for CML, named Sampling-Free Collaborative Metric Learning (SFCML).
arXiv Detail & Related papers (2022-06-23T08:50:22Z)
- Jo-SRC: A Contrastive Approach for Combating Noisy Labels [58.867237220886885]
We propose a noise-robust approach named Jo-SRC (Joint Sample Selection and Model Regularization based on Consistency).
Specifically, we train the network in a contrastive learning manner. Predictions from two different views of each sample are used to estimate its "likelihood" of being clean or out-of-distribution.
arXiv Detail & Related papers (2021-03-24T07:26:07Z)
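Two of the entries above (NETS, SCLD) build on annealed importance sampling with Jarzynski-style weights and SMC resampling, and the NAAS abstract cites the variance of exactly these importance weights as its motivation. The following is a minimal, self-contained sketch of that classical baseline on a toy bimodal target. It is an illustration only, not an implementation of any paper listed here; the target, annealing schedule, and unadjusted Langevin kernel are all assumptions made for brevity:

```python
import numpy as np

# Minimal AIS/SMC sketch with Jarzynski-style incremental weights on a toy
# 1-D bimodal target. Illustrative only -- not NAAS, NETS, or SCLD.

def log_prior(x):                        # base density pi_0 = N(0,1), unnormalized
    return -0.5 * x**2

def log_target(x):                       # bimodal target pi_1, unnormalized
    return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

def grad_log_target(x):
    a = -0.5 * (x - 3.0)**2
    b = -0.5 * (x + 3.0)**2
    wa = np.exp(a - np.logaddexp(a, b))  # responsibility of the mode at +3
    return wa * (3.0 - x) + (1.0 - wa) * (-(x + 3.0))

def ais(n=5000, steps=200, dt=0.05, resample=True, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)           # particles start from pi_0
    logw = np.zeros(n)                   # running log importance weights
    log_z = 0.0                          # accumulated log of Z_target / Z_prior
    betas = np.linspace(0.0, 1.0, steps + 1)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Jarzynski / AIS incremental weight along the geometric path
        # pi_b propto pi_0^(1-b) * pi_1^b
        logw += (b - b_prev) * (log_target(x) - log_prior(x))
        # one unadjusted Langevin step targeting pi_b (approximate kernel;
        # a Metropolis correction would make the weights exact)
        grad = (1.0 - b) * (-x) + b * grad_log_target(x)
        x = x + dt * grad + np.sqrt(2.0 * dt) * rng.standard_normal(n)
        if resample:                     # SMC-style resampling on ESS collapse
            w = np.exp(logw - np.logaddexp.reduce(logw))
            w /= w.sum()
            if 1.0 / np.sum(w**2) < n / 2:
                log_z += np.logaddexp.reduce(logw) - np.log(n)
                x = rng.choice(x, size=n, p=w)
                logw = np.zeros(n)
    log_z += np.logaddexp.reduce(logw) - np.log(n)
    return x, log_z

samples, log_z = ais()
print(f"log(Z_target / Z_prior) estimate: {log_z:.3f}")  # true value: log 2 ~ 0.693
```

Dropping the resampling branch recovers plain AIS; the high variance of `logw` on harder, higher-dimensional problems is precisely the failure mode that motivates learning a control or drift term instead.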
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.