On gauge freedom, conservativity and intrinsic dimensionality estimation
in diffusion models
- URL: http://arxiv.org/abs/2402.03845v1
- Date: Tue, 6 Feb 2024 09:41:43 GMT
- Title: On gauge freedom, conservativity and intrinsic dimensionality estimation
in diffusion models
- Authors: Christian Horvat and Jean-Pascal Pfister
- Abstract summary: Diffusion models are generative models that have recently demonstrated impressive performance in terms of sampling quality and density estimation in high dimensions.
In the original formulation of the diffusion model, this vector field is assumed to be the score function.
We show that exact density estimation and exact sampling are achieved when the conservative component exactly equals the true score.
- Score: 13.597551064547503
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models are generative models that have recently demonstrated
impressive performance in terms of sampling quality and density estimation in
high dimensions. They rely on a forward continuous diffusion process and a
backward continuous denoising process, which can be described by a
time-dependent vector field and is used as a generative model. In the original
formulation of the diffusion model, this vector field is assumed to be the
score function (i.e. it is the gradient of the log-probability at a given time
in the diffusion process). Curiously, on the practical side, most studies on
diffusion models implement this vector field as a neural network function and
do not constrain it to be the gradient of some energy function (that is, most
studies do not constrain the vector field to be conservative). Even though some
studies investigated empirically whether such a constraint would lead to a
performance gain, they reached contradictory results and failed to provide
analytical results. Here, we provide three analytical results regarding the
extent of the modeling freedom of this vector field. Firstly, we propose a
novel decomposition of vector fields into a conservative component and an
orthogonal component which satisfies a given (gauge) freedom. Secondly, from
this orthogonal decomposition, we show that exact density estimation and exact
sampling are achieved when the conservative component exactly equals the
true score; conservativity is therefore neither necessary nor sufficient to
obtain exact density estimation and exact sampling. Finally, we show that when
it comes to inferring local information of the data manifold, constraining the
vector field to be conservative is desirable.
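The conservativity property at the heart of the abstract has a simple computable characterization: a smooth vector field is conservative (i.e. the gradient of some scalar potential) if and only if its Jacobian is symmetric. The following numpy sketch (an illustration, not code from the paper) checks this numerically for a Gaussian score, which is conservative, and for the same field with a rotational (divergence-free, non-conservative) component added:

```python
import numpy as np

def score_gaussian(x, sigma=1.0):
    # Score of an isotropic Gaussian N(0, sigma^2 I): grad log p(x) = -x / sigma^2.
    return -x / sigma**2

def rotational_component(x):
    # A non-conservative field in 2D: (-x2, x1); its Jacobian is antisymmetric.
    return np.array([-x[1], x[0]])

def jacobian(f, x, eps=1e-5):
    # Numerical Jacobian via central differences.
    d = x.size
    J = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d); e[i] = eps
        J[:, i] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

def asymmetry(f, x):
    # Max-abs entry of J - J^T; zero iff the Jacobian is symmetric at x,
    # which for a smooth field on a simply connected domain means conservative.
    J = jacobian(f, x)
    return np.max(np.abs(J - J.T))

x = np.array([0.7, -1.3])
print(asymmetry(score_gaussian, x))  # ≈ 0: conservative
print(asymmetry(lambda y: score_gaussian(y) + rotational_component(y), x))  # ≈ 2: not conservative
```

The second field illustrates the paper's point: adding a component orthogonal to the conservative score can leave the conservative part (and hence, per the paper's result, exact density estimation and sampling) untouched while destroying conservativity itself.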
Related papers
- Convergence of Diffusion Models Under the Manifold Hypothesis in High-Dimensions [6.9408143976091745]
Denoising Diffusion Probabilistic Models (DDPM) are powerful state-of-the-art methods used to generate synthetic data from high-dimensional data distributions.
We study DDPMs under the manifold hypothesis and prove that they achieve rates independent of the ambient dimension in terms of learning the score.
In terms of sampling, we obtain rates independent of the ambient dimension w.r.t. the Kullback-Leibler divergence, and $O(\sqrt{D})$ w.r.t. the Wasserstein distance.
arXiv Detail & Related papers (2024-09-27T14:57:18Z) - Latent Space Score-based Diffusion Model for Probabilistic Multivariate Time Series Imputation [6.9295879301090535]
We propose the Latent Space Score-Based Diffusion Model (LSSDM) for probabilistic time series imputation.
LSSDM achieves superior imputation performance while also providing a better explanation and uncertainty analysis of the imputation mechanism.
arXiv Detail & Related papers (2024-09-13T15:32:26Z) - Your Absorbing Discrete Diffusion Secretly Models the Conditional Distributions of Clean Data [55.54827581105283]
We show that the concrete score in absorbing diffusion can be expressed as conditional probabilities of clean data.
We propose a dedicated diffusion model without time-condition that characterizes the time-independent conditional probabilities.
Our models achieve SOTA performance among diffusion models on 5 zero-shot language modeling benchmarks.
arXiv Detail & Related papers (2024-06-06T04:22:11Z) - Amortizing intractable inference in diffusion models for vision, language, and control [89.65631572949702]
This paper studies amortized sampling of the posterior over data, $\mathbf{x} \sim p^{\rm post}(\mathbf{x}) \propto p(\mathbf{x})\,r(\mathbf{x})$, in a model that consists of a diffusion generative model prior $p(\mathbf{x})$ and a black-box constraint or function $r(\mathbf{x})$.
We prove the correctness of a data-free learning objective, relative trajectory balance, for training a diffusion model that samples from this posterior.
arXiv Detail & Related papers (2024-05-31T16:18:46Z) - Statistically Optimal Generative Modeling with Maximum Deviation from the Empirical Distribution [2.1146241717926664]
We show that the Wasserstein GAN, constrained to left-invertible push-forward maps, generates distributions that avoid replication and significantly deviate from the empirical distribution.
Our most important contribution provides a finite-sample lower bound on the Wasserstein-1 distance between the generative distribution and the empirical one.
We also establish a finite-sample upper bound on the distance between the generative distribution and the true data-generating one.
arXiv Detail & Related papers (2023-07-31T06:11:57Z) - Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis on approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion
Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - Flexible Amortized Variational Inference in qBOLD MRI [56.4324135502282]
Oxygen extraction fraction (OEF) and deoxygenated blood volume (DBV) are more ambiguously determined from the data.
Existing inference methods tend to yield very noisy and underestimated OEF maps, while overestimating DBV.
This work describes a novel probabilistic machine learning approach that can infer plausible distributions of OEF and DBV.
arXiv Detail & Related papers (2022-03-11T10:47:16Z) - Improving Nonparametric Density Estimation with Tensor Decompositions [14.917420021212912]
Nonparametric density estimators often perform well on low dimensional data, but suffer when applied to higher dimensional data.
This paper investigates whether these improvements can be extended to other simplified dependence assumptions.
We prove that restricting estimation to low-rank nonnegative PARAFAC or Tucker decompositions removes the dimensionality exponent on bin width rates for multidimensional histograms.
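The rank-1 special case of the PARAFAC restriction above is easy to picture: the multidimensional histogram tensor is approximated by an outer product of its 1-D marginal histograms, i.e. an independence assumption across coordinates. A minimal numpy sketch (an illustration under that simplifying assumption, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))  # 3-dimensional data

bins = 20
edges = [np.linspace(-4, 4, bins + 1)] * 3

# Full multidimensional histogram: bins**3 cells must be estimated.
H_full, _ = np.histogramdd(X, bins=edges)
H_full = H_full / H_full.sum()

# Rank-1 PARAFAC surrogate: outer product of normalized 1-D marginals,
# so only 3 * bins numbers are estimated instead of bins**3.
marginals = [np.histogram(X[:, j], bins=edges[j])[0].astype(float) for j in range(3)]
marginals = [m / m.sum() for m in marginals]
H_rank1 = np.einsum('i,j,k->ijk', *marginals)

print(H_full.shape, H_rank1.shape)  # both (20, 20, 20)
```

Higher-rank PARAFAC or Tucker decompositions interpolate between this fully factorized estimator and the full histogram, which is how the dimensionality exponent on the bin-width rate is removed while retaining some cross-coordinate dependence.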
arXiv Detail & Related papers (2020-10-06T01:39:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.