Poisson Flow Generative Models
- URL: http://arxiv.org/abs/2209.11178v1
- Date: Thu, 22 Sep 2022 17:26:58 GMT
- Title: Poisson Flow Generative Models
- Authors: Yilun Xu, Ziming Liu, Max Tegmark, Tommi Jaakkola
- Abstract summary: "Poisson flow" generative model maps a uniform distribution on a high-dimensional hemisphere into any data distribution.
PFGM achieves current state-of-the-art performance among the normalizing flow models on CIFAR-10.
- Score: 9.843778728210427
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new "Poisson flow" generative model (PFGM) that maps a uniform
distribution on a high-dimensional hemisphere into any data distribution. We
interpret the data points as electrical charges on the $z=0$ hyperplane in a
space augmented with an additional dimension $z$, generating a high-dimensional
electric field (the gradient of the solution to the Poisson equation). We prove
that if these charges flow upward along electric field lines, their initial
distribution in the $z=0$ plane transforms into a distribution on the
hemisphere of radius $r$ that becomes uniform in the $r \to\infty$ limit. To
learn the bijective transformation, we estimate the normalized field in the
augmented space. For sampling, we devise a backward ODE that is anchored by the
physically meaningful additional dimension: the samples hit the unaugmented
data manifold when $z$ reaches zero. Experimentally, PFGM achieves current
state-of-the-art performance among normalizing flow models on CIFAR-10,
with an Inception score of $9.68$ and an FID score of $2.48$. It also performs
on par with state-of-the-art SDE approaches while offering $10\times$ to
$20\times$ acceleration on image generation tasks. Additionally, PFGM appears
more tolerant of estimation errors on a weaker network architecture and robust
to the step size in the Euler method. The code is available at
https://github.com/Newbeeer/poisson_flow .
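To make the abstract's recipe concrete, here is a minimal toy sketch of the two ingredients: the empirical Poisson field obtained by treating training points as charges on the $z=0$ hyperplane, and the backward ODE integrated with Euler steps until $z$ reaches zero. Everything here (function names, the prior stand-in, the step schedule) is an illustrative assumption rather than the authors' implementation; PFGM proper trains a network to approximate the normalized field instead of summing over the dataset.

```python
import numpy as np

def poisson_field(x_aug, data):
    """Unnormalized empirical Poisson field in the augmented (N+1)-dim space.

    x_aug : (N+1,) query point -- N data coordinates plus the extra z coordinate.
    data  : (M, N) training points, treated as charges on the z=0 hyperplane.
    Returns sum_i (x_aug - y_i) / ||x_aug - y_i||^(N+1); the Green's-function
    normalization is omitted because it cancels in the ratio E_x / E_z below.
    """
    M, N = data.shape
    charges = np.hstack([data, np.zeros((M, 1))])       # lift charges to z = 0
    diff = x_aug - charges                              # (M, N+1)
    dist = np.linalg.norm(diff, axis=1, keepdims=True)  # (M, 1)
    return (diff / dist ** (N + 1)).sum(axis=0)         # (N+1,)

def sample_pfgm(data, z_max=40.0, n_steps=200, seed=0):
    """Backward-Euler sampler anchored by z: integrate z from z_max down to ~0."""
    rng = np.random.default_rng(seed)
    M, N = data.shape
    # Crude stand-in for the hemisphere prior: start high up on the z-axis.
    x = np.append(rng.normal(scale=z_max / 4, size=N), z_max)
    zs = np.linspace(z_max, 1e-3, n_steps)
    for z_cur, z_next in zip(zs[:-1], zs[1:]):
        E = poisson_field(x, data)
        x[:N] += (E[:N] / E[N]) * (z_next - z_cur)      # backward ODE: dx/dz = E_x / E_z
        x[N] = z_next                                   # z is the physical anchor
    return x[:N]                                        # lands near the z=0 data manifold

# Toy usage: two well-separated 2-D modes.
toy = np.vstack([np.zeros((50, 2)), 5 * np.ones((50, 2))])
print(sample_pfgm(toy))
```

Because the backward ODE only uses the ratio $E_x/E_z$, the Green's-function normalization constant drops out, which is why the sketch can ignore it.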
Related papers
- Monge-Ampère Regularization for Learning Arbitrary Shapes from Point Clouds [69.69726932986923]
We propose the scaled-squared distance function (S$^2$DF), a novel implicit surface representation for modeling arbitrary surface types.
S$^2$DF does not distinguish between inside and outside regions while effectively addressing the non-differentiability issue of the unsigned distance function (UDF) at the zero level set.
We demonstrate that S$^2$DF satisfies a second-order partial differential equation of Monge-Ampère type.
arXiv Detail & Related papers (2024-10-24T06:56:34Z)
- A Sharp Convergence Theory for The Probability Flow ODEs of Diffusion Models [45.60426164657739]
We develop non-asymptotic convergence theory for a diffusion-based sampler.
We prove that $d/\varepsilon$ iterations are sufficient to approximate the target distribution to within $\varepsilon$ total-variation distance.
Our results also characterize how $\ell_2$ score estimation errors affect the quality of the data generation processes.
arXiv Detail & Related papers (2024-08-05T09:02:24Z)
- Convergence Analysis of Probability Flow ODE for Score-based Generative Models [5.939858158928473]
We study the convergence properties of deterministic samplers based on probability flow ODEs from both theoretical and numerical perspectives.
We prove that the total variation between the target and the generated data distributions can be bounded above by $\mathcal{O}(d^{3/4}\delta^{1/2})$ in continuous time.
arXiv Detail & Related papers (2024-04-15T12:29:28Z)
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- Nearly $d$-Linear Convergence Bounds for Diffusion Models via Stochastic Localization [40.808942894229325]
We provide the first convergence bounds which are linear in the data dimension.
We show that diffusion models require at most $\tilde{O}\left(\frac{d \log^2(1/\delta)}{\varepsilon^2}\right)$ steps to approximate an arbitrary distribution.
arXiv Detail & Related papers (2023-08-07T16:01:14Z) - Effective Minkowski Dimension of Deep Nonparametric Regression: Function
Approximation and Statistical Theories [70.90012822736988]
Existing theories on deep nonparametric regression have shown that when the input data lie on a low-dimensional manifold, deep neural networks can adapt to intrinsic data structures.
This paper introduces a relaxed assumption that input data are concentrated around a subset of $\mathbb{R}^d$ denoted by $\mathcal{S}$, and that the intrinsic dimension of $\mathcal{S}$ can be characterized by a new complexity notion -- the effective Minkowski dimension.
arXiv Detail & Related papers (2023-06-26T17:13:31Z) - Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative
Models [49.81937966106691]
We develop a suite of non-asymptotic theory towards understanding the data generation process of diffusion models.
In contrast to prior works, our theory is developed based on an elementary yet versatile non-asymptotic approach.
arXiv Detail & Related papers (2023-06-15T16:30:08Z) - PFGM++: Unlocking the Potential of Physics-Inspired Generative Models [14.708385906024546]
We introduce a new family of physics-inspired generative models termed PFGM++.
These models realize generative trajectories for $N$-dimensional data by embedding paths in an $(N+D)$-dimensional space.
We show that models with finite $D$ can be superior to previous state-of-the-art diffusion models (a sketch of the PFGM++ perturbation kernel appears after this list).
arXiv Detail & Related papers (2023-02-08T18:58:02Z) - Optimal Robust Linear Regression in Nearly Linear Time [97.11565882347772]
We study the problem of high-dimensional robust linear regression where a learner is given access to $n$ samples from the generative model $Y = \langle X, w^* \rangle + \epsilon$.
We propose estimators for this problem under two settings: (i) $X$ is $L_4$-$L_2$ hypercontractive, $\mathbb{E}[XX^\top]$ has bounded condition number, and $\epsilon$ has bounded variance; and (ii) $X$ is sub-Gaussian with identity second moment and $\epsilon$ is ...
arXiv Detail & Related papers (2020-07-16T06:44:44Z) - Gravitational-wave parameter estimation with autoregressive neural
network flows [0.0]
We introduce the use of autoregressive normalizing flows for rapid likelihood-free inference of binary black hole system parameters from gravitational-wave data with deep neural networks.
A normalizing flow is an invertible mapping on a sample space that can be used to induce a transformation from a simple probability distribution to a more complex one (see the change-of-variables identity after this list).
We build a more powerful latent variable model by incorporating autoregressive flows within the variational autoencoder framework.
arXiv Detail & Related papers (2020-02-18T15:44:04Z) - Curse of Dimensionality on Randomized Smoothing for Certifiable
Robustness [151.67113334248464]
We show that extending the smoothing technique to defend against other attack models can be challenging.
We present experimental results on CIFAR to validate our theory.
arXiv Detail & Related papers (2020-02-08T22:02:14Z)
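As referenced in the PFGM++ entry above, the $N+D$ construction can be sketched through its perturbation kernel. Assuming the kernel form given in the PFGM++ paper, $p_r(x \mid y) \propto (\|x - y\|^2 + r^2)^{-(N+D)/2}$, this is a multivariate Student-t distribution with $D$ degrees of freedom, so sampling it is a two-liner; the function name below is illustrative, not from the authors' code.

```python
import numpy as np

def perturb_pfgmpp(y, r, D, rng=np.random.default_rng(0)):
    """Draw x ~ p_r(x | y) ∝ (||x - y||^2 + r^2)^{-(N+D)/2}.

    Writing the kernel as a multivariate Student-t with D degrees of
    freedom gives x = y + r * g / sqrt(w), g ~ N(0, I_N), w ~ chi^2_D.
    Small D yields heavy tails (PFGM-like behavior); D -> infinity with
    r = sigma * sqrt(D) recovers the Gaussian kernel of diffusion models.
    """
    g = rng.normal(size=y.shape)   # Gaussian numerator
    w = rng.chisquare(D)           # chi-squared mixing variable
    return y + r * g / np.sqrt(w)
```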
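The normalizing-flow definition in the gravitational-wave entry is, formally, the change-of-variables identity: for an invertible map $f$ sending data $x$ to a latent $z = f(x)$ with simple base density $p_Z$, the model density is $p_X(x) = p_Z(f(x)) \left| \det \frac{\partial f(x)}{\partial x} \right|$, so evaluating the likelihood needs only the base density and the Jacobian determinant of $f$. Autoregressive flows structure $f$ so that this Jacobian is triangular, making the determinant a cheap product of diagonal terms.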
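Finally, for context on the last entry: the "smoothing technique" whose extensions that paper studies is the $\ell_2$ randomized-smoothing construction popularized by Cohen et al. (2019). Below is a simplified Monte Carlo sketch under assumed names; the plug-in radius is illustrative only, since the certified procedure replaces the estimate of $p_A$ with a high-confidence lower bound.

```python
import numpy as np
from scipy.stats import norm

def smoothed_predict(f, x, sigma, n=1000, seed=0):
    """Estimate the smoothed classifier g(x) = argmax_c P[f(x + eps) = c],
    eps ~ N(0, sigma^2 I), and a naive l2 radius sigma * Phi^{-1}(p_A).

    `f` maps an input array to a class label. This is a plug-in sketch; a
    rigorous certificate uses a confidence lower bound on p_A, not p_A itself.
    """
    rng = np.random.default_rng(seed)
    counts = {}
    for _ in range(n):
        label = f(x + rng.normal(scale=sigma, size=x.shape))
        counts[label] = counts.get(label, 0) + 1
    top = max(counts, key=counts.get)            # majority-vote class
    p_a = counts[top] / n                        # plug-in top-class probability
    radius = sigma * norm.ppf(p_a) if p_a > 0.5 else 0.0
    return top, radius
```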
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.