Progressive Tempering Sampler with Diffusion
- URL: http://arxiv.org/abs/2506.05231v2
- Date: Mon, 20 Oct 2025 09:43:29 GMT
- Title: Progressive Tempering Sampler with Diffusion
- Authors: Severi Rissanen, RuiKang OuYang, Jiajun He, Wenlin Chen, Markus Heinonen, Arno Solin, José Miguel Hernández-Lobato
- Abstract summary: We propose a neural sampler that trains diffusion models sequentially across temperatures. We also introduce a novel method to combine high-temperature diffusion models to generate approximate lower-temperature samples. Our method significantly improves target evaluation efficiency, outperforming diffusion-based neural samplers.
- Score: 50.06039228068602
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent research has focused on designing neural samplers that amortize the process of sampling from unnormalized densities. However, despite significant advancements, they still fall short of the state-of-the-art MCMC approach, Parallel Tempering (PT), when it comes to the efficiency of target evaluations. On the other hand, unlike a well-trained neural sampler, PT yields only dependent samples and needs to be rerun -- at considerable computational cost -- whenever new samples are required. To address these weaknesses, we propose the Progressive Tempering Sampler with Diffusion (PTSD), which trains diffusion models sequentially across temperatures, leveraging the advantages of PT to improve the training of neural samplers. We also introduce a novel method to combine high-temperature diffusion models to generate approximate lower-temperature samples, which are minimally refined using MCMC and used to train the next diffusion model. PTSD enables efficient reuse of sample information across temperature levels while generating well-mixed, uncorrelated samples. Our method significantly improves target evaluation efficiency, outperforming diffusion-based neural samplers.
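The progressive loop described in the abstract can be illustrated with a toy 1D sketch, in which the diffusion model trained at each temperature level is replaced by a simple Gaussian surrogate and the target is a quadratic toy energy. All function names, the energy, and the temperature ladder below are illustrative assumptions, not the paper's actual implementation:

```python
import math
import random

def energy(x):
    # Toy target energy: standard Gaussian, E(x) = x^2 / 2.
    return 0.5 * x * x

def metropolis_refine(xs, temp, rng, steps=25, scale=0.5):
    # Minimal MCMC refinement of approximate samples at temperature `temp`,
    # targeting p_T(x) proportional to exp(-E(x) / T).
    out = []
    for x in xs:
        for _ in range(steps):
            y = x + rng.gauss(0.0, scale)
            if rng.random() < math.exp(min(0.0, (energy(x) - energy(y)) / temp)):
                x = y
        out.append(x)
    return out

def fit_gaussian(xs):
    # Stand-in for "training a diffusion model" on the samples at one level.
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, math.sqrt(v)

def progressive_tempering(temps, n=2000, seed=0):
    # `temps` must be decreasing; the last entry is the target temperature.
    rng = random.Random(seed)
    # Hottest level is easy to sample: initialize wide and refine with MCMC.
    xs = [rng.gauss(0.0, math.sqrt(temps[0])) for _ in range(n)]
    xs = metropolis_refine(xs, temps[0], rng)
    for temp in temps[1:]:
        m, s = fit_gaussian(xs)                   # "train" surrogate at hot level
        xs = [rng.gauss(m, s) for _ in range(n)]  # propose from the surrogate
        xs = metropolis_refine(xs, temp, rng)     # minimal refinement at cooler level
    return xs
```

Because each cooler level is initialized from the previous level's surrogate, only a short MCMC refinement is needed per level, which is the efficiency argument the abstract makes.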
Related papers
- Learning To Sample From Diffusion Models Via Inverse Reinforcement Learning [43.678382510171986]
Diffusion models generate samples through an iterative denoising process, guided by a neural network. We introduce an inverse reinforcement learning framework for learning sampling strategies without retraining the denoiser. We provide experimental evidence that this approach can improve the quality of samples generated by pretrained diffusion models.
arXiv Detail & Related papers (2026-02-09T14:10:44Z) - Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities [85.83359661628575]
We propose Progressive Inference-Time Annealing (PITA) to learn diffusion-based samplers. PITA combines two complementary techniques: annealing of the Boltzmann distribution and diffusion smoothing. It enables equilibrium sampling of N-body particle systems, Alanine Dipeptide, and tripeptides in Cartesian coordinates.
arXiv Detail & Related papers (2025-06-19T17:14:22Z) - On scalable and efficient training of diffusion samplers [26.45926098524023]
We address the challenge of training diffusion models to sample from unnormalized energy distributions in the absence of data. We propose a scalable and sample-efficient framework that harmonizes powerful classical sampling methods with the diffusion sampler. Our method significantly improves sample efficiency on standard benchmarks for diffusion samplers and also excels at higher-dimensional problems and real-world molecular conformer generation.
arXiv Detail & Related papers (2025-05-26T06:16:34Z) - Inference-Time Diffusion Model Distillation [59.350789627086456]
We introduce Distillation++, a novel inference-time distillation framework. Inspired by recent advances in conditional sampling, our approach recasts student-model sampling as a proximal optimization problem. We integrate distillation optimization during reverse sampling, which can be viewed as teacher guidance.
arXiv Detail & Related papers (2024-12-12T02:07:17Z) - EM Distillation for One-step Diffusion Models [65.57766773137068]
We propose a maximum likelihood-based approach that distills a diffusion model to a one-step generator model with minimal loss of quality. We develop a reparametrized sampling scheme and a noise cancellation technique that together stabilize the distillation process.
arXiv Detail & Related papers (2024-05-27T05:55:22Z) - Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
We introduce Iterated Denoising Energy Matching (iDEM), which alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
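The alternating scheme in this summary can be sketched in a toy setting, with the diffusion sampler replaced by a learnable 1D Gaussian and the matching objective replaced by exact score matching against a standard-normal target. These are simplifying assumptions for illustration, not iDEM's actual Monte Carlo estimator:

```python
import math
import random

def target_score(x):
    # Exact score of the toy target N(0, 1): d/dx log p(x) = -x.
    return -x

def iterated_matching_toy(iters=300, batch=256, lr=0.05, seed=1):
    rng = random.Random(seed)
    m, u = 2.0, math.log(2.0)  # model is N(m, e^{2u}); deliberately wrong start
    for _ in range(iters):
        s2 = math.exp(2.0 * u)
        # (I) sample regions of high model density from the current sampler
        xs = [rng.gauss(m, math.sqrt(s2)) for _ in range(batch)]
        # (II) use these samples in a score-matching objective
        gm = gu = 0.0
        for x in xs:
            r = -(x - m) / s2 - target_score(x)  # model score minus target score
            gm += 2.0 * r / s2                   # d(r^2)/dm
            gu += 4.0 * r * (x - m) / s2         # d(r^2)/du
        m -= lr * gm / batch
        u -= lr * gu / batch
    return m, math.exp(u)
```

The residual vanishes as the model approaches the target, so the alternation converges to the mean and scale of the target distribution in this toy case.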
arXiv Detail & Related papers (2024-02-09T01:11:23Z) - Boosting Diffusion Models with an Adaptive Momentum Sampler [21.88226514633627]
We present a novel reverse sampler for DPMs inspired by the widely-used Adam sampler.
Our proposed sampler can be readily applied to a pre-trained diffusion model.
By implicitly reusing update directions from early steps, our proposed sampler achieves a better balance between high-level semantics and low-level details.
arXiv Detail & Related papers (2023-08-23T06:22:02Z) - Entropy-based Training Methods for Scalable Neural Implicit Sampler [20.35664492719671]
In this paper, we introduce an efficient and scalable implicit neural sampler that overcomes these limitations. The implicit sampler can generate large batches of samples with low computational costs. By employing the two training methods, we effectively optimize the neural implicit samplers to learn and generate from the desired target distribution.
arXiv Detail & Related papers (2023-06-08T05:56:05Z) - Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs) for iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs.
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.