Particle Guidance: non-I.I.D. Diverse Sampling with Diffusion Models
- URL: http://arxiv.org/abs/2310.13102v2
- Date: Fri, 24 Nov 2023 09:42:21 GMT
- Title: Particle Guidance: non-I.I.D. Diverse Sampling with Diffusion Models
- Authors: Gabriele Corso, Yilun Xu, Valentin de Bortoli, Regina Barzilay, Tommi Jaakkola
- Abstract summary: We propose particle guidance, an extension of diffusion-based generative sampling where a joint-particle time-evolving potential enforces diversity.
We theoretically analyze the joint distribution that particle guidance generates, how to learn a potential that achieves optimal diversity, and its connections to methods in other disciplines.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In light of the widespread success of generative models, a significant amount
of research has gone into speeding up their sampling time. However, generative
models are often sampled multiple times to obtain a diverse set, incurring a
cost that is orthogonal to sampling time. We tackle the question of how to
improve diversity and sample efficiency by moving beyond the common assumption
of independent samples. We propose particle guidance, an extension of
diffusion-based generative sampling where a joint-particle time-evolving
potential enforces diversity. We theoretically analyze the joint distribution
that particle guidance generates, how to learn a potential that achieves
optimal diversity, and its connections to methods in other disciplines.
Empirically, we test the framework both in the setting of conditional image
generation, where we are able to increase diversity without affecting quality,
and molecular conformer generation, where we reduce the state-of-the-art median
error by 13% on average.
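The core mechanism described in the abstract can be sketched in a few lines: each particle follows the score of the target distribution plus the gradient of a joint repulsive kernel potential over the whole particle set, so the samples are coupled rather than i.i.d. The sketch below is a minimal stand-in, assuming annealed Langevin dynamics with a toy standard-normal target and an RBF kernel; it is not the paper's learned potential or a trained diffusion model.

```python
import numpy as np

def repulsion_grad(x, sigma=1.0):
    """Repulsive direction for each particle: the negative gradient, w.r.t.
    x_i, of a joint RBF-kernel potential sum_j k(x_i, x_j)."""
    grads = np.zeros_like(x)
    for i in range(len(x)):
        diff = x[i] - x                                    # (n, d)
        k = np.exp(-np.sum(diff**2, axis=1) / (2 * sigma**2))
        grads[i] = np.sum(k[:, None] * diff, axis=0) / sigma**2
    return grads

def sample_with_guidance(score_fn, n=8, dim=2, steps=200,
                         eps=0.01, guid=0.0, seed=0):
    """Langevin stand-in for reverse diffusion over a *joint* set of
    particles: every update adds the repulsive term, coupling the samples."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n, dim))                          # start from the prior
    for _ in range(steps):
        drift = score_fn(x) + guid * repulsion_grad(x)
        x = x + eps * drift + np.sqrt(2 * eps) * rng.normal(size=x.shape)
    return x
```

With `guid=0.0` this reduces to ordinary i.i.d. sampling; a positive `guid` trades a small amount of per-sample likelihood for set-level diversity, mirroring the trade-off the abstract describes.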
Related papers
- Accelerated Diffusion Models via Speculative Sampling [89.43940130493233]
Speculative sampling is a popular technique for accelerating inference in Large Language Models.
We extend speculative sampling to diffusion models, which generate samples via continuous, vector-valued Markov chains.
We propose various drafting strategies, including a simple and effective approach that does not require training a draft model.
arXiv Detail & Related papers (2025-01-09T16:50:16Z)
- Diverse Rare Sample Generation with Pretrained GANs [24.227852798611025]
This study proposes a novel approach for generating diverse rare samples from high-resolution image datasets with pretrained GANs.
Our method employs gradient-based optimization of latent vectors within a multi-objective framework and utilizes normalizing flows for density estimation on the feature space.
This enables the generation of diverse rare images, with controllable parameters for rarity, diversity, and similarity to a reference image.
arXiv Detail & Related papers (2024-12-27T09:10:30Z)
- Zigzag Diffusion Sampling: Diffusion Models Can Self-Improve via Self-Reflection [28.82743020243849]
Existing text-to-image diffusion models often fail to maintain high image quality and high prompt-image alignment for challenging prompts.
We propose diffusion self-reflection that alternately performs denoising and inversion.
We derive Zigzag Diffusion Sampling (Z-Sampling), a novel self-reflection-based diffusion sampling method.
arXiv Detail & Related papers (2024-12-14T16:42:41Z)
- Inferring Parameter Distributions in Heterogeneous Motile Particle Ensembles: A Likelihood Approach for Second Order Langevin Models [0.8274836883472768]
Inference methods are required to understand and predict motion patterns from time-discrete trajectory data provided by experiments.
We propose a new method to approximate the likelihood for non-linear second order Langevin models.
We thereby pave the way for the systematic, data-driven inference of dynamical models for actively driven entities.
arXiv Detail & Related papers (2024-11-13T15:27:02Z)
- Conditional Synthesis of 3D Molecules with Time Correction Sampler [58.0834973489875]
Time-Aware Conditional Synthesis (TACS) is a novel approach to conditional generation with diffusion models.
It integrates adaptively controlled plug-and-play "online" guidance into a diffusion model, driving samples toward the desired properties.
arXiv Detail & Related papers (2024-11-01T12:59:25Z)
- Provable Statistical Rates for Consistency Diffusion Models [87.28777947976573]
Despite the state-of-the-art performance, diffusion models are known for their slow sample generation due to the extensive number of steps involved.
This paper contributes towards the first statistical theory for consistency models, formulating their training as a distribution discrepancy minimization problem.
arXiv Detail & Related papers (2024-06-23T20:34:18Z)
- Fast Sampling via Discrete Non-Markov Diffusion Models with Predetermined Transition Time [49.598085130313514]
We propose discrete non-Markov diffusion models (DNDM), which naturally induce the predetermined transition time set.
This enables a training-free sampling algorithm that significantly reduces the number of function evaluations.
We study the transition from finite to infinite step sampling, offering new insights into bridging the gap between discrete and continuous-time processes.
arXiv Detail & Related papers (2023-12-14T18:14:11Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- DLow: Diversifying Latent Flows for Diverse Human Motion Prediction [32.22704734791378]
We propose a novel sampling method, Diversifying Latent Flows (DLow), to produce a diverse set of samples from a pretrained deep generative model.
During training, DLow uses a diversity-promoting prior over samples as an objective to optimize the latent mappings to improve sample diversity.
Our experiments demonstrate that DLow outperforms state-of-the-art baseline methods in terms of sample diversity and accuracy.
arXiv Detail & Related papers (2020-03-18T17:58:11Z)
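The diversity-promoting prior mentioned in the DLow summary can be illustrated with a toy objective: a set of affine latent mappings is scored by the KL divergence of each mapped latent to the N(0, I) prior plus an RBF energy over the decoded samples that shrinks as the samples spread apart. The identity decoder, the specific energy, and the weight `beta` below are illustrative assumptions, not DLow's exact training setup.

```python
import numpy as np

def diversity_energy(samples, scale=1.0):
    """Average RBF energy over all ordered sample pairs; lower when the
    samples are spread out, so minimizing it promotes diversity."""
    n = len(samples)
    e = 0.0
    for i in range(n):
        for j in range(n):
            if i != j:
                e += np.exp(-np.sum((samples[i] - samples[j])**2) / scale)
    return e / (n * (n - 1))

def dlow_objective(decoder, transforms, z, beta=10.0):
    """Toy DLow-style objective for K affine latent mappings (A_k, b_k):
    KL(N(b_k, A_k A_k^T) || N(0, I)) keeps mapped latents plausible, while
    the diversity energy over decoded samples pushes them apart."""
    latents = [A @ z + b for A, b in transforms]
    ys = np.stack([decoder(u) for u in latents])
    kl = sum(0.5 * (np.trace(A @ A.T) + b @ b - len(b)
                    - np.log(np.linalg.det(A @ A.T)))
             for A, b in transforms)
    return kl + beta * diversity_energy(ys)
```

Minimizing this over the `(A_k, b_k)` pairs, with the pretrained decoder frozen, is the general shape of the idea: one shared latent sample is mapped to several decoded outputs that are jointly encouraged to differ.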
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.