DiffFlow: A Unified SDE Framework for Score-Based Diffusion Models and
Generative Adversarial Networks
- URL: http://arxiv.org/abs/2307.02159v1
- Date: Wed, 5 Jul 2023 10:00:53 GMT
- Title: DiffFlow: A Unified SDE Framework for Score-Based Diffusion Models and
Generative Adversarial Networks
- Authors: Jingwei Zhang, Han Shi, Jincheng Yu, Enze Xie, and Zhenguo Li
- Abstract summary: We propose a unified theoretical framework for score-based diffusion models (SDMs) and generative adversarial nets (GANs).
Under this unified theoretical framework, we introduce several instantiations of DiffFlow that provide new algorithms beyond GANs and SDMs with exact likelihood inference.
- Score: 41.451880167535776
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative models can be categorized into two types: explicit generative
models that define explicit density forms and allow exact likelihood inference,
such as score-based diffusion models (SDMs) and normalizing flows; implicit
generative models that directly learn a transformation from the prior to the
data distribution, such as generative adversarial nets (GANs). While these two
types of models have shown great success, they suffer from respective
limitations that hinder them from achieving fast sampling and high sample
quality simultaneously. In this paper, we propose a unified theoretical framework
for SDMs and GANs. We show that: i) the learning dynamics of both SDMs and
GANs can be described as a novel SDE named Discriminator Denoising Diffusion
Flow (DiffFlow) where the drift can be determined by some weighted combinations
of scores of the real data and the generated data; ii) By adjusting the
relative weights between different score terms, we can obtain a smooth
transition between SDMs and GANs while the marginal distribution of the SDE
remains invariant to the change of the weights; iii) we prove the asymptotic
optimality and maximum likelihood training scheme of the DiffFlow dynamics; iv)
under our unified theoretical framework, we introduce several instantiations of
DiffFlow that provide new algorithms beyond GANs and SDMs with exact
likelihood inference and have the potential to achieve a flexible trade-off
between high sample quality and fast sampling speed.
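The drift described in point (i) can be made concrete with a toy simulation. Below is a minimal, hypothetical Euler-Maruyama sketch of a DiffFlow-style SDE in one dimension: the drift is a weighted combination of a closed-form real-data score and a crude Gaussian estimate of the generated-data score. The weights lam_real and lam_gen, the Gaussian targets, and the step schedule are illustrative assumptions, not the paper's actual instantiation; setting lam_gen = 0 leaves a pure score-based drift, echoing the claimed smooth transition between SDMs and GANs.

```python
# Toy Euler-Maruyama simulation of a DiffFlow-style SDE in 1-D.
# All parameters and the Gaussian score models are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def score_gaussian(x, mu, sigma):
    """Score, i.e. gradient of the log-density, of N(mu, sigma^2)."""
    return -(x - mu) / sigma**2

def diffflow_step(x, lam_real, lam_gen, dt, noise_scale):
    # Drift = weighted combination of the real-data score and a crude
    # Gaussian moment-matching estimate of the generated-data score.
    drift = (lam_real * score_gaussian(x, 2.0, 0.5)
             + lam_gen * score_gaussian(x, x.mean(), x.std() + 1e-6))
    return x + drift * dt + noise_scale * np.sqrt(dt) * rng.standard_normal(x.shape)

x = rng.standard_normal(5000)   # prior samples from N(0, 1)
for _ in range(1000):
    # lam_gen = 0 gives a pure score-based (SDM-like) drift; a nonzero
    # lam_gen adds the generated-data score term to the dynamics.
    x = diffflow_step(x, lam_real=1.0, lam_gen=-0.1, dt=1e-2, noise_scale=0.3)

print(f"mean ~ {x.mean():.2f}")  # the mean drifts toward the real-data mode at 2.0
```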
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Neural Diffusion Models [2.1779479916071067]
We present a generalization of conventional diffusion models that enables defining and learning time-dependent non-linear transformations of data.
NDMs outperform conventional diffusion models in terms of likelihood and produce high-quality samples.
arXiv Detail & Related papers (2023-10-12T13:54:55Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
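For reference, the classic mean-shift iteration that the paper above connects to optimal ODE-based sampling looks as follows. This is a generic textbook sketch on toy data, not the paper's derivation; the dataset and bandwidth h are arbitrary choices.

```python
# Classic mean-shift (mode-seeking) iteration with a Gaussian kernel.
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.3, 200), rng.normal(3, 0.3, 200)])

def mean_shift_step(x, data, h):
    # Each point moves to the kernel-weighted average of the data,
    # i.e. one step toward the nearest mode of the kernel density estimate.
    w = np.exp(-(x[:, None] - data[None, :])**2 / (2 * h**2))
    return (w * data[None, :]).sum(axis=1) / w.sum(axis=1)

x = rng.uniform(-4, 5, 10)             # arbitrary starting points
for _ in range(50):
    x = mean_shift_step(x, data, h=0.5)

print(np.round(x, 2))                  # points collapse onto the modes near -2 and 3
```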
- Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models [77.83923746319498]
We propose a framework called Diff-Instruct to instruct the training of arbitrary generative models.
We show that Diff-Instruct results in state-of-the-art single-step diffusion-based models.
Experiments on refining GAN models show that Diff-Instruct consistently improves pre-trained GAN generators.
arXiv Detail & Related papers (2023-05-29T04:22:57Z)
- The Score-Difference Flow for Implicit Generative Modeling [1.309716118537215]
Implicit generative modeling (IGM) aims to produce samples of synthetic data matching a target data distribution.
Recent work has approached the IGM problem from the perspective of pushing synthetic source data toward the target distribution.
We present the score difference between arbitrary target and source distributions as a flow that optimally reduces the Kullback-Leibler divergence between them.
arXiv Detail & Related papers (2023-04-25T15:21:12Z)
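The score-difference flow above admits a short numerical illustration. The sketch below moves particles along dx/dt = score_target(x) - score_source(x) using closed-form Gaussian scores; the 1-D distributions, step size, and moment-tracking of the source are illustrative assumptions (a learned score model would replace them in practice).

```python
# Minimal sketch of a score-difference flow between two 1-D Gaussians.
import numpy as np

rng = np.random.default_rng(2)

def gaussian_score(x, mu, sigma):
    return -(x - mu) / sigma**2

x = rng.normal(0.0, 1.0, 5000)          # source samples from N(0, 1)
for _ in range(500):
    # Track the evolving source distribution by its empirical moments
    # (exact for Gaussians; a learned score network would replace this).
    mu_s, sigma_s = x.mean(), x.std()
    velocity = gaussian_score(x, 3.0, 0.5) - gaussian_score(x, mu_s, sigma_s)
    x = x + 0.01 * velocity             # deterministic flow, no added noise

print(f"mean ~ {x.mean():.2f}, std ~ {x.std():.2f}")  # approaches the target N(3, 0.5)
```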
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
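The reflection idea can be illustrated with a generic reflected-SDE step: after each Euler-Maruyama update, trajectories that exit the data support are folded back inside. The unit interval, pure Brownian dynamics, and billiard-style reflection below are illustrative assumptions, not the paper's generalized score-matching scheme.

```python
# Toy reflected Brownian motion on [0, 1]: samples never leave the support.
import numpy as np

rng = np.random.default_rng(3)

def reflect(x, lo=0.0, hi=1.0):
    # Fold excursions back into [lo, hi] (billiard-style reflection).
    period = 2 * (hi - lo)
    x = np.mod(x - lo, period)
    return lo + np.where(x > hi - lo, period - x, x)

x = rng.uniform(0.4, 0.6, 5000)     # start concentrated mid-interval
dt = 1e-3
for _ in range(5000):
    # Pure diffusion step followed by reflection at the boundary {0, 1}.
    x = reflect(x + np.sqrt(dt) * rng.standard_normal(x.shape))

print(f"range [{x.min():.3f}, {x.max():.3f}]")  # stays inside [0, 1]
```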
- Fast Inference in Denoising Diffusion Models via MMD Finetuning [23.779985842891705]
We present MMD-DDM, a novel method for fast sampling of diffusion models.
Our approach is based on the idea of using the Maximum Mean Discrepancy (MMD) to finetune the learned distribution with a given budget of timesteps.
Our findings show that the proposed method is able to produce high-quality samples in a fraction of the time required by widely-used diffusion models.
arXiv Detail & Related papers (2023-01-19T09:48:07Z)
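The MMD objective that MMD-DDM minimizes can be sketched directly. Below is a standard biased estimator of squared MMD with an RBF kernel on toy 1-D samples; the bandwidth and data are assumptions, and in the actual method the generated samples would come from the truncated diffusion sampler with the loss backpropagated into its parameters.

```python
# Squared Maximum Mean Discrepancy (MMD) with an RBF kernel on 1-D samples.
import numpy as np

def rbf(a, b, h):
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * h**2))

def mmd2(gen, real, h=1.0):
    # Biased estimator: E[k(g, g')] - 2 E[k(g, r)] + E[k(r, r')].
    return rbf(gen, gen, h).mean() - 2 * rbf(gen, real, h).mean() + rbf(real, real, h).mean()

rng = np.random.default_rng(4)
real = rng.normal(0.0, 1.0, 1000)
print(f"mismatched: {mmd2(rng.normal(2.0, 1.0, 1000), real):.4f}")
print(f"matched:    {mmd2(rng.normal(0.0, 1.0, 1000), real):.4f}")  # near zero
```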
- Modiff: Action-Conditioned 3D Motion Generation with Denoising Diffusion Probabilistic Models [58.357180353368896]
We propose a conditional paradigm that benefits from the denoising diffusion probabilistic model (DDPM) to tackle the problem of realistic and diverse action-conditioned 3D skeleton-based motion generation.
Ours is a pioneering attempt to use DDPM to synthesize a variable number of motion sequences conditioned on a categorical action.
arXiv Detail & Related papers (2023-01-10T13:15:42Z)