Unraveling the Connections between Flow Matching and Diffusion Probabilistic Models in Training-free Conditional Generation
- URL: http://arxiv.org/abs/2411.07625v1
- Date: Tue, 12 Nov 2024 08:14:39 GMT
- Title: Unraveling the Connections between Flow Matching and Diffusion Probabilistic Models in Training-free Conditional Generation
- Authors: Kaiyu Song, Hanjiang Lai
- Abstract summary: Flow matching (FM) and diffusion probabilistic models (DPMs) are two mature unconditional diffusion models.
We show that a probabilistic diffusion path can be associated with both FM and DPMs.
We propose two posterior sampling methods to estimate the conditional term, achieving training-free conditional generation with FM.
- Score: 7.3604864243987365
- Abstract: Training-free conditional generation aims to leverage unconditional diffusion models to implement conditional generation, where flow matching (FM) and diffusion probabilistic models (DPMs) are two mature unconditional diffusion models that achieve high-quality generation. This paper asks two questions: What are the underlying connections between FM and DPMs in training-free conditional generation? Can we leverage DPMs to improve training-free conditional generation for FM? We first show that a probabilistic diffusion path can be associated with both FM and DPMs. We then reformulate the ordinary differential equation (ODE) of FM in terms of the score function of DPMs, so that conditions can be incorporated into FM just as they are in DPMs (see the sketch below). Finally, we propose two posterior sampling methods to estimate the conditional term, achieving training-free conditional generation with FM. Experimental results show that the proposed method applies to a variety of conditional generation tasks and generates higher-quality results than state-of-the-art methods.
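To make the reformulation concrete, here is a minimal sketch of the standard score-based rewriting of a flow-matching ODE for Gaussian probability paths; the schedule notation $\alpha_t$, $\sigma_t$ is our own illustration, not necessarily the paper's exact formulation.

```latex
% Minimal sketch: rewriting a flow-matching ODE via the DPM score function.
% Assumes a Gaussian probability path p_t(x | x_1) = N(alpha_t x_1, sigma_t^2 I);
% the schedule notation (alpha_t, sigma_t) is illustrative, not the paper's.
\[
  \frac{\mathrm{d}x_t}{\mathrm{d}t} = u_t(x_t)
  = \frac{\dot{\alpha}_t}{\alpha_t}\, x_t
    + \left(\frac{\dot{\alpha}_t}{\alpha_t}\,\sigma_t^2 - \dot{\sigma}_t\sigma_t\right)
      \nabla_x \log p_t(x_t),
\]
% so the FM velocity is an affine function of the DPM score. For a condition y,
% Bayes' rule splits the conditional score:
\[
  \nabla_x \log p_t(x \mid y)
  = \nabla_x \log p_t(x) + \nabla_x \log p_t(y \mid x).
\]
```

Substituting the second identity into the first injects the conditional term $\nabla_x \log p_t(y \mid x)$ into the FM ODE; estimating that term is exactly what the paper's two posterior sampling methods target.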
Related papers
- Local Flow Matching Generative Models [19.859984725284896]
Flow Matching (FM) is a simulation-free method for learning a continuous and invertible flow to interpolate between two distributions.
We introduce Local Flow Matching (LFM), which learns a sequence of FM sub-models; each sub-model matches a diffusion process over one step-size interval in the data-to-noise direction (a sketch follows below).
In experiments, we demonstrate the improved training efficiency and competitive generative performance of LFM compared to FM.
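As a reading aid, here is a hedged sketch of the stepwise construction the summary describes; the step size $\delta$, the VP-style noising step, and the symbols $p_n$, $v^{(n)}$ are illustrative notation, not LFM's own.

```latex
% Hedged sketch of a local (stepwise) flow-matching chain; the VP-style
% noising step and all notation (delta, p_n, v^{(n)}) are illustrative.
\[
  p_0 = p_{\mathrm{data}}, \qquad
  p_n = \mathrm{Law}\!\left(\sqrt{1-\delta}\, X_{n-1} + \sqrt{\delta}\, Z\right),
  \quad X_{n-1} \sim p_{n-1},\; Z \sim \mathcal{N}(0, I).
\]
% Each p_n is one diffusion step (of size delta) noisier than p_{n-1}, and
% p_N approaches N(0, I) for large N. Sub-model v^{(n)} is trained by ordinary
% flow matching to transport p_{n-1} to p_n; sampling integrates the learned
% flows v^{(N)}, ..., v^{(1)} in reverse, from Gaussian noise back to data.
```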
arXiv Detail & Related papers (2024-10-03T14:53:10Z)
- Your Absorbing Discrete Diffusion Secretly Models the Conditional Distributions of Clean Data [55.54827581105283]
We show that the concrete score in absorbing diffusion can be expressed as conditional probabilities of clean data.
We propose a dedicated diffusion model without time conditioning that characterizes these time-independent conditional probabilities (a hedged restatement follows below).
Our models achieve state-of-the-art performance among diffusion models on 5 zero-shot language modeling benchmarks.
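For intuition, here is a hedged restatement of the factorization the summary alludes to; the noise-schedule symbol $\bar{\sigma}(t)$ and the mask notation are our own reading, not necessarily the paper's.

```latex
% Hedged restatement: the concrete score of absorbing (mask) diffusion
% factorizes into a time-only scalar times a clean-data conditional.
\[
  \frac{p_t(\hat{x})}{p_t(x)}
  = \underbrace{\frac{e^{-\bar{\sigma}(t)}}{1 - e^{-\bar{\sigma}(t)}}}_{\text{depends only on } t}
    \cdot
    \underbrace{p_{\mathrm{data}}\!\left(\hat{x}_i \mid x^{\mathrm{UM}}\right)}_{\text{time-independent}}
\]
% Here x carries a mask at position i, x-hat fills that position with token
% x-hat_i, and x^UM denotes the unmasked tokens. Since only the scalar
% prefactor depends on t, the denoising network itself can drop its time input.
```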
arXiv Detail & Related papers (2024-06-06T04:22:11Z)
- Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample size under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z)
- Deep MMD Gradient Flow without adversarial training [69.76417786943217]
We propose a gradient flow procedure for generative modeling that transports particles from an initial source distribution to a target distribution (a toy sketch follows below).
The noise-adaptive Wasserstein gradient of the maximum mean discrepancy (MMD) is trained on data distributions corrupted by increasing levels of noise.
We demonstrate the validity of the approach when MMD is replaced by a lower bound on the KL divergence.
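To ground the idea, here is a toy sketch of plain MMD gradient flow on particles with a fixed RBF kernel; the paper's method additionally learns a noise-adaptive witness function with a neural network, which this sketch deliberately omits.

```python
# Toy sketch: MMD gradient flow on particles with a fixed RBF kernel.
# The paper learns a noise-adaptive witness with a neural network;
# this is only the plain, fixed-kernel version of the idea.
import numpy as np

def rbf(a, b, h):
    """k(a, b) = exp(-||a - b||^2 / (2 h^2)), pairwise over rows."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h * h))

def mmd_grad(x, y, h):
    """Gradient of MMD^2(x, y) with respect to the particles x."""
    n, m = len(x), len(y)
    kxx = rbf(x, x, h)                      # (n, n) kernel among particles
    kxy = rbf(x, y, h)                      # (n, m) kernel particles-to-data
    # d/dx_i k(x_i, b) = -(x_i - b) / h^2 * k(x_i, b)
    gxx = -(kxx[:, :, None] * (x[:, None, :] - x[None, :, :])).sum(1) / h**2
    gxy = -(kxy[:, :, None] * (x[:, None, :] - y[None, :, :])).sum(1) / h**2
    return (2.0 / n**2) * gxx - (2.0 / (n * m)) * gxy

rng = np.random.default_rng(0)
target = rng.normal(loc=3.0, size=(500, 2))      # samples of the target
particles = rng.normal(loc=0.0, size=(200, 2))   # initial source particles
for _ in range(300):                             # discretized gradient flow
    particles -= 0.5 * mmd_grad(particles, target, h=1.0)
```

A known weakness of such fixed-kernel MMD flows is that gradients vanish when source and target are far apart, which is what motivates training the witness on noise-corrupted data as the paper does.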
arXiv Detail & Related papers (2024-05-10T19:10:45Z)
- Extended Flow Matching: a Method of Conditional Generation with Generalized Continuity Equation [19.71452214879951]
Conditional generation is one of the most important applications of generative models.
We show that inductive bias can be introduced into conditional generation through the matrix field.
We present our theory along with experimental results that support the competitiveness of Extended Flow Matching (EFM) in conditional generation.
arXiv Detail & Related papers (2024-02-29T04:12:32Z)
- DiffFlow: A Unified SDE Framework for Score-Based Diffusion Models and Generative Adversarial Networks [41.451880167535776]
We propose a unified theoretical framework for score-based diffusion models (SDMs) and generative adversarial networks (GANs).
Under this unified framework, we introduce several instantiations of DiffFlow that provide new algorithms beyond GANs and SDMs with exact likelihood inference.
arXiv Detail & Related papers (2023-07-05T10:00:53Z)
- Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models [77.83923746319498]
We propose a framework called Diff-Instruct to instruct the training of arbitrary generative models (a hedged sketch of its objective follows below).
We show that Diff-Instruct yields state-of-the-art single-step diffusion-based models.
Experiments on refining GAN models show that Diff-Instruct consistently improves the pre-trained generators of GAN models.
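As context, here is a hedged sketch of the integral-KL-style distillation objective Diff-Instruct is usually described with; the weighting $w(t)$ and the notation are our paraphrase, not necessarily the paper's exact formulation.

```latex
% Hedged sketch of an integral-KL distillation objective in the spirit of
% Diff-Instruct; w(t) and the notation are our paraphrase, not the paper's.
\[
  \mathcal{L}(\theta)
  = \int_0^T w(t)\,
    D_{\mathrm{KL}}\!\left(q_t^{\theta} \,\Vert\, p_t\right) \mathrm{d}t
\]
% Here q_t^theta is the student generator's output distribution diffused to
% noise level t, and p_t is the corresponding marginal of the pre-trained
% diffusion model. Matching across the whole noise scale (rather than only at
% t = 0) lets the pre-trained score network supply usable gradients to the
% student at every level.
```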
arXiv Detail & Related papers (2023-05-29T04:22:57Z)
- ShiftDDPMs: Exploring Conditional Diffusion Models by Shifting Diffusion Trajectories [144.03939123870416]
We propose a novel conditional diffusion model by introducing conditions into the forward process.
We use an extra latent space to allocate an exclusive diffusion trajectory to each condition based on shifting rules (a hedged sketch follows below).
We formulate our method, which we call ShiftDDPMs, and provide a unified point of view on existing related methods.
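For intuition, here is a hedged sketch of what a condition-shifted forward process can look like; the shift schedule $k_t$ and condition encoder $E(c)$ are illustrative names, not taken from the paper.

```latex
% Hedged sketch of a condition-shifted DDPM forward process; the shift
% schedule k_t and encoder E(c) are illustrative names, not the paper's.
\[
  q(x_t \mid x_0, c)
  = \mathcal{N}\!\left(\sqrt{\bar{\alpha}_t}\, x_0 + k_t\, E(c),\;
                       (1 - \bar{\alpha}_t)\, I\right)
\]
% This is the usual DDPM marginal N(sqrt(alpha-bar_t) x_0, (1 - alpha-bar_t) I)
% with its mean shifted by a condition-dependent offset k_t E(c), so each
% condition c travels along its own diffusion trajectory and the reverse
% process can recover the condition from the trajectory.
```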
arXiv Detail & Related papers (2023-02-05T12:48:21Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like the one used to train the flow in diffusion models, but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference (a toy training sketch follows below).
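To make the objective concrete, here is a toy training sketch of the CFM regression loss with minibatch-OT pairing; the tiny MLP, batch size, and 2-D ring target are illustrative choices, not the paper's experimental configuration.

```python
# Toy sketch: conditional flow matching (CFM) with minibatch OT pairing
# (OT-CFM). The tiny MLP and the 2-D Gaussian-to-ring setup are illustrative.
import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(3, 64), nn.SiLU(), nn.Linear(64, 2))  # input: (x_t, t)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def sample_data(n):
    """Stand-in target distribution: a noisy ring in 2-D."""
    theta = np.random.rand(n) * 2 * np.pi
    return np.stack([np.cos(theta), np.sin(theta)], 1) * 3 + 0.1 * np.random.randn(n, 2)

for step in range(1000):
    x0 = np.random.randn(256, 2)           # source: standard Gaussian
    x1 = sample_data(256)                   # target: data samples
    # Minibatch OT: pair sources to targets by solving a linear assignment
    # on squared Euclidean cost, which straightens the conditional paths.
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
    rows, cols = linear_sum_assignment(cost)
    x0 = torch.tensor(x0[rows], dtype=torch.float32)
    x1 = torch.tensor(x1[cols], dtype=torch.float32)

    t = torch.rand(256, 1)
    xt = (1 - t) * x0 + t * x1              # point on the straight path
    ut = x1 - x0                            # target velocity along that path
    loss = ((net(torch.cat([xt, t], 1)) - ut) ** 2).mean()  # CFM regression
    opt.zero_grad(); loss.backward(); opt.step()
```

The OT pairing is what distinguishes OT-CFM from independent-coupling CFM: the same loss trains with random pairs too, but the resulting paths cross more, making the flow harder to learn and slower to integrate.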
arXiv Detail & Related papers (2023-02-01T14:47:17Z)