Formulating Discrete Probability Flow Through Optimal Transport
- URL: http://arxiv.org/abs/2311.03886v1
- Date: Tue, 7 Nov 2023 11:03:27 GMT
- Title: Formulating Discrete Probability Flow Through Optimal Transport
- Authors: Pengze Zhang, Hubery Yin, Chen Li, Xiaohua Xie
- Abstract summary: We first prove that the continuous probability flow is the Monge optimal transport map under certain conditions, and present equivalent evidence for the discrete case.
We then define the discrete probability flow in line with the principles of optimal transport.
Experiments on a synthetic toy dataset and the CIFAR-10 dataset have validated our proposed discrete probability flow.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Continuous diffusion models are commonly acknowledged to display a
deterministic probability flow, whereas discrete diffusion models do not. In
this paper, we aim to establish the fundamental theory for the probability flow
of discrete diffusion models. Specifically, we first prove that the continuous
probability flow is the Monge optimal transport map under certain conditions,
and present equivalent evidence for the discrete case. In view of these
findings, we are then able to define the discrete probability flow in line with
the principles of optimal transport. Finally, drawing upon our newly
established definitions, we propose a novel sampling method that surpasses
previous discrete diffusion models in its ability to generate more certain
outcomes. Extensive experiments on a synthetic toy dataset and the CIFAR-10
dataset have validated the effectiveness of our proposed discrete probability
flow. Code is released at:
https://github.com/PangzeCheung/Discrete-Probability-Flow.
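For background, the deterministic probability flow mentioned above is, in the standard score-based diffusion setting, the probability flow ODE, and the Monge map is the classical optimal transport formulation; both are stated here in their usual form (standard results, not quoted from the paper). For a forward SDE dx = f(x, t) dt + g(t) dw with marginals p_t, the probability flow ODE sharing those marginals is

  \frac{\mathrm{d}x}{\mathrm{d}t} = f(x, t) - \frac{1}{2} g(t)^2 \, \nabla_x \log p_t(x),

and the Monge problem seeks a map T transporting a source distribution p onto a target q at minimal expected cost c,

  \min_{T \,:\, T_{\#} p = q} \; \int c\bigl(x, T(x)\bigr) \, \mathrm{d}p(x),

where T_{\#} p denotes the pushforward of p under T. The paper's first result identifies the former as a solution of the latter under certain conditions.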
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- Discrete Flow Matching [74.04153927689313]
We present a novel discrete flow paradigm designed specifically for generating discrete data.
Our approach is capable of generating high-quality discrete data in a non-autoregressive fashion.
arXiv Detail & Related papers (2024-07-22T12:33:27Z)
- Generative Assignment Flows for Representing and Learning Joint Distributions of Discrete Data [2.6499018693213316]
We introduce a novel generative model for the representation of joint probability distributions of a possibly large number of discrete random variables.
The embedding of the flow via the Segre map in the meta-simplex of all discrete joint distributions ensures that any target distribution can be represented in principle.
Our approach is strongly motivated by first principles for modeling coupled discrete variables.
arXiv Detail & Related papers (2024-06-06T21:58:33Z)
- Conditional Pseudo-Reversible Normalizing Flow for Surrogate Modeling in Quantifying Uncertainty Propagation [11.874729463016227]
We introduce a conditional pseudo-reversible normalizing flow for constructing surrogate models of a physical model polluted by additive noise.
The training process utilizes a dataset consisting of input-output pairs, without requiring prior knowledge of the noise or the function.
Our model, once trained, can generate samples from any conditional probability density functions whose high probability regions are covered by the training set.
arXiv Detail & Related papers (2024-03-31T00:09:58Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from the instillation of task-specific information into the score function to steer sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
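As background on the guidance mechanism this summary refers to (a standard formulation stated here for context, not taken from this paper; y and w are generic notation): guidance replaces the unconditional score with a weighted conditional one,

  \tilde{s}(x, t) = \nabla_x \log p_t(x) + w \, \nabla_x \log p_t(y \mid x),

where y is the conditioning signal and w > 0 scales the guidance strength; w = 1 corresponds to exact Bayes conditioning on y.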
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Diffusion on the Probability Simplex [24.115365081118604]
Diffusion models learn to reverse the progressive noising of a data distribution to create a generative model.
We propose a method of performing diffusion on the probability simplex.
We find that our methodology naturally extends to include diffusion on the unit cube which has applications for bounded image generation.
arXiv Detail & Related papers (2023-09-05T18:52:35Z)
- Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis of the approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
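For reference, the classical minimax rate for estimating a density of smoothness s in d dimensions scales as

  n^{-s / (2s + d)}

up to logarithmic factors (standard nonparametric-estimation background, not a quotation from the paper); the claim above is that a properly trained diffusion model nearly attains this rate when the true density lies in a Besov space.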
arXiv Detail & Related papers (2023-03-03T11:31:55Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- DensePure: Understanding Diffusion Models towards Adversarial Robustness [110.84015494617528]
We analyze the properties of diffusion models and establish the conditions under which they can enhance certified robustness.
We propose a new method, DensePure, designed to improve the certified robustness of a pretrained model (i.e., a classifier).
We show that this robust region is a union of multiple convex sets, and is potentially much larger than the robust regions identified in previous works.
arXiv Detail & Related papers (2022-11-01T08:18:07Z)
- Continuous and Distribution-free Probabilistic Wind Power Forecasting: A Conditional Normalizing Flow Approach [1.684864188596015]
We present a data-driven approach for probabilistic wind power forecasting based on conditional normalizing flows (CNF).
In contrast with existing methods, this approach is distribution-free (as with non-parametric and quantile-based approaches) and can directly yield continuous probability densities.
arXiv Detail & Related papers (2022-06-06T08:48:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.