The Score-Difference Flow for Implicit Generative Modeling
- URL: http://arxiv.org/abs/2304.12906v2
- Date: Tue, 18 Jul 2023 15:31:25 GMT
- Title: The Score-Difference Flow for Implicit Generative Modeling
- Authors: Romann M. Weber
- Abstract summary: Implicit generative modeling (IGM) aims to produce samples of synthetic data matching a target data distribution.
Recent work has approached the IGM problem from the perspective of pushing synthetic source data toward the target distribution.
We present the score difference between arbitrary target and source distributions as a flow that optimally reduces the Kullback-Leibler divergence between them.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Implicit generative modeling (IGM) aims to produce samples of synthetic data
matching the characteristics of a target data distribution. Recent work (e.g.
score-matching networks, diffusion models) has approached the IGM problem from
the perspective of pushing synthetic source data toward the target distribution
via dynamical perturbations or flows in the ambient space. In this direction,
we present the score difference (SD) between arbitrary target and source
distributions as a flow that optimally reduces the Kullback-Leibler divergence
between them while also solving the Schrödinger bridge problem. We apply the
SD flow to convenient proxy distributions, which are aligned if and only if the
original distributions are aligned. We demonstrate the formal equivalence of
this formulation to denoising diffusion models under certain conditions. We
also show that the training of generative adversarial networks includes a
hidden data-optimization sub-problem, which induces the SD flow under certain
choices of loss function when the discriminator is optimal. As a result, the SD
flow provides a theoretical link between model classes that individually
address the three challenges of the "generative modeling trilemma" -- high
sample quality, mode coverage, and fast sampling -- thereby setting the stage
for a unified approach.
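As a rough illustration of the central mechanism (a toy sketch under our own simplifying assumptions, not the paper's algorithm), the SD flow moves source samples along the difference of scores, dx/dt = ∇x log p(x) − ∇x log q_t(x), where p is the target and q_t is the current sample distribution. In the 1-D Gaussian toy below, both scores are available in closed form; all names and constants are illustrative.

```python
import numpy as np

def gaussian_score(x, mean, var):
    # Score of N(mean, var): d/dx log N(x; mean, var) = -(x - mean) / var
    return -(x - mean) / var

# Target distribution p = N(3, 0.5); source samples start from q_0 = N(0, 2).
p_mean, p_var = 3.0, 0.5
rng = np.random.default_rng(0)
x = rng.normal(0.0, np.sqrt(2.0), size=10_000)

dt = 0.05
for _ in range(400):
    # The source score must track the *current* sample distribution q_t.
    # Because the drift below is affine and q_0 is Gaussian, q_t stays
    # Gaussian, so moment matching on the particles estimates its score.
    q_mean, q_var = x.mean(), x.var()
    # Forward-Euler step along the score difference:
    #   dx/dt = score_p(x) - score_{q_t}(x)
    x += dt * (gaussian_score(x, p_mean, p_var)
               - gaussian_score(x, q_mean, q_var))

print(f"final mean {x.mean():+.3f} vs target {p_mean}, "
      f"final var {x.var():.3f} vs target {p_var}")
```

In realistic settings neither score is available in closed form; both would be estimated, e.g. by learned score networks or, per the paper's GAN connection, from an optimal discriminator.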
Related papers
- Diffusion Models as Network Optimizers: Explorations and Analysis [71.69869025878856]
Generative diffusion models (GDMs) have emerged as a promising new approach to network optimization.
In this study, we first explore the intrinsic characteristics of generative models.
We provide a concise theoretical and intuitive demonstration of the advantages of generative models over discriminative network optimization.
arXiv Detail & Related papers (2024-11-01T09:05:47Z)
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- Constrained Diffusion Models via Dual Training [80.03953599062365]
Diffusion processes are prone to generating samples that reflect biases in a training dataset.
We develop constrained diffusion models by imposing diffusion constraints based on desired distributions.
We show that our constrained diffusion models generate new data from a mixture data distribution that achieves the optimal trade-off between the objective and the constraints.
arXiv Detail & Related papers (2024-08-27T14:25:42Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from the instillation of task-specific information into the score function to steer sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Bayesian Flow Networks [4.585102332532472]
This paper introduces Bayesian Flow Networks (BFNs), a new class of generative model in which the parameters of a set of independent distributions are modified with Bayesian inference.
Starting from a simple prior and iteratively updating the two distributions yields a generative procedure similar to the reverse process of diffusion models; a toy conjugate-update sketch appears after this list.
BFNs achieve competitive log-likelihoods for image modelling on dynamically binarized MNIST and CIFAR-10, and outperform all known discrete diffusion models on the text8 character-level language modelling task.
arXiv Detail & Related papers (2023-08-14T09:56:35Z)
- DiffFlow: A Unified SDE Framework for Score-Based Diffusion Models and Generative Adversarial Networks [41.451880167535776]
We propose a unified theoretical framework for score-based diffusion models (SDMs) and generative adversarial networks (GANs).
Under this unified framework, we introduce several instantiations of DiffFlow that provide new algorithms beyond GANs and SDMs with exact likelihood inference.
arXiv Detail & Related papers (2023-07-05T10:00:53Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- From Points to Functions: Infinite-dimensional Representations in Diffusion Models [23.916417852496608]
Diffusion-based generative models learn to iteratively transfer unstructured noise to a complex target distribution.
We show that a combination of information content from different time steps gives a strictly better representation for the downstream task.
arXiv Detail & Related papers (2022-10-25T05:30:53Z)
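To make the "iteratively updating distribution parameters with Bayesian inference" step in the Bayesian Flow Networks entry concrete, here is a generic conjugate Normal-Normal update on a toy scalar problem. This is only an illustration of closed-form Bayesian parameter updates, not the BFN algorithm itself; the data-generating process, names, and constants are our own.

```python
import numpy as np

rng = np.random.default_rng(1)
true_mean, noise_var = 2.0, 1.0   # hypothetical data-generating process

# Prior over the unknown mean: N(mu, var)
mu, var = 0.0, 10.0

for _ in range(50):
    y = rng.normal(true_mean, np.sqrt(noise_var))   # one noisy observation
    # Conjugate Normal-Normal update: precisions add, and the posterior
    # mean is the precision-weighted average of prior mean and observation.
    post_prec = 1.0 / var + 1.0 / noise_var
    mu = (mu / var + y / noise_var) / post_prec
    var = 1.0 / post_prec

print(f"posterior over mean: N({mu:.3f}, {var:.4f}); true mean = {true_mean}")
```

Per the entry above, BFNs build their generative procedure around closed-form updates of this kind, applied to the parameters of a set of independent per-dimension distributions.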