Score Approximation, Estimation and Distribution Recovery of Diffusion
Models on Low-Dimensional Data
- URL: http://arxiv.org/abs/2302.07194v1
- Date: Tue, 14 Feb 2023 17:02:35 GMT
- Authors: Minshuo Chen, Kaixuan Huang, Tuo Zhao, Mengdi Wang
- Abstract summary: This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Diffusion models achieve state-of-the-art performance in various generation
tasks. However, their theoretical foundations fall far behind. This paper
studies score approximation, estimation, and distribution recovery of diffusion
models, when data are supported on an unknown low-dimensional linear subspace.
Our result provides sample complexity bounds for distribution estimation using
diffusion models. We show that with a properly chosen neural network
architecture, the score function can be both accurately approximated and
efficiently estimated. Furthermore, the generated distribution based on the
estimated score function captures the data geometric structures and converges
to a close vicinity of the data distribution. The convergence rate depends on
the subspace dimension, indicating that diffusion models can circumvent the
curse of data ambient dimensionality.
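The subspace-recovery claim can be illustrated with a small numerical sketch (an assumption-laden toy example, not the paper's construction): when data lie on a d-dimensional linear subspace with Gaussian latents, the noise-corrupted density is Gaussian, so the exact score is available in closed form, and Tweedie's denoising formula removes the off-subspace noise exactly. All names and parameters below are illustrative.

```python
import numpy as np

# Toy setup (illustrative assumption, not the paper's construction):
# data x = A z lives on a d-dimensional linear subspace of R^D, with
# orthonormal basis A and Gaussian latents z ~ N(0, I).
rng = np.random.default_rng(0)
D, d, n = 16, 2, 2000
A, _ = np.linalg.qr(rng.standard_normal((D, d)))  # orthonormal D x d basis
X = rng.standard_normal((n, d)) @ A.T             # n samples on the subspace

# Gaussian corruption x_t = x + sigma * eps. Because the clean data are
# Gaussian on the subspace, x_t ~ N(0, A A^T + sigma^2 I), so the score
# of the noised density has the closed form s(x_t) = -Cov^{-1} x_t.
sigma = 0.5
Xt = X + sigma * rng.standard_normal((n, D))
cov = A @ A.T + sigma**2 * np.eye(D)
score = -Xt @ np.linalg.inv(cov)                  # exact noised score (rows)

# Tweedie's formula E[x | x_t] = x_t + sigma^2 s(x_t): the posterior mean's
# component orthogonal to the subspace vanishes, i.e. the score captures the
# subspace geometry and removes the off-subspace noise exactly.
P_perp = np.eye(D) - A @ A.T                      # projector onto complement
off_subspace = Xt @ P_perp                        # off-subspace part of x_t
residual = off_subspace + sigma**2 * (score @ P_perp)
print(np.linalg.norm(residual))                   # ~ 0 up to round-off
```

In this linear-Gaussian toy case the score is exact rather than estimated by a neural network, but it shows the geometric mechanism the paper analyzes: the score's normal component steers noisy samples back onto the low-dimensional support.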
Related papers
- Your Absorbing Discrete Diffusion Secretly Models the Conditional Distributions of Clean Data [55.54827581105283]
We show that the concrete score in absorbing diffusion can be expressed as conditional probabilities of clean data.
We propose a dedicated diffusion model without time-condition that characterizes the time-independent conditional probabilities.
Our models achieve SOTA performance among diffusion models on 5 zero-shot language modeling benchmarks.
arXiv Detail & Related papers (2024-06-06T04:22:11Z)
- Amortizing intractable inference in diffusion models for vision, language, and control [89.65631572949702]
This paper studies amortized sampling of the posterior over data, $\mathbf{x} \sim p^{\mathrm{post}}(\mathbf{x}) \propto p(\mathbf{x})\,r(\mathbf{x})$, in a model that consists of a diffusion generative model prior $p(\mathbf{x})$ and a black-box constraint or function $r(\mathbf{x})$.
We prove the correctness of a data-free learning objective, relative trajectory balance, for training a diffusion model that samples from this posterior.
arXiv Detail & Related papers (2024-05-31T16:18:46Z)
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Analyzing Neural Network-Based Generative Diffusion Models through Convex Optimization [45.72323731094864]
We present a theoretical framework to analyze two-layer neural network-based diffusion models.
We prove that training shallow neural networks for score prediction can be done by solving a single convex program.
Our results provide a precise characterization of what neural network-based diffusion models learn in non-asymptotic settings.
arXiv Detail & Related papers (2024-02-03T00:20:25Z)
- Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis on approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z)
- Infinite-Dimensional Diffusion Models [4.342241136871849]
We formulate diffusion-based generative models in infinite dimensions and apply them to the generative modeling of functions.
We show that our formulations are well posed in the infinite-dimensional setting and provide dimension-independent distance bounds from the sample to the target measure.
We also develop guidelines for the design of infinite-dimensional diffusion models.
arXiv Detail & Related papers (2023-02-20T18:00:38Z)
- Your diffusion model secretly knows the dimension of the data manifold [0.0]
A diffusion model approximates the gradient of the log density of a noise-corrupted version of the target distribution for varying levels of corruption.
We prove that, if the data concentrates around a manifold embedded in the high-dimensional ambient space, then as the level of corruption decreases, the score function points towards the manifold.
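A minimal numerical check of this claim (an illustrative sketch under simplifying assumptions, not the paper's estimator): for a linear "manifold" with Gaussian latents, the noised score is exact, and as the corruption level shrinks, the score components normal to the subspace dominate the tangent ones, so the singular spectrum of a batch of scores exposes the intrinsic dimension. The setup and the factor-100 threshold below are ad hoc choices for the toy example.

```python
import numpy as np

# Toy check (illustrative sketch, not the paper's estimator): data on a
# d-dimensional linear subspace of R^D with Gaussian latents, so the score
# of the noise-corrupted density is exact and easy to evaluate.
rng = np.random.default_rng(1)
D, d, n = 8, 3, 500
A, _ = np.linalg.qr(rng.standard_normal((D, d)))  # orthonormal D x d basis
X = rng.standard_normal((n, d)) @ A.T             # samples on the subspace

sigma = 1e-3                                      # small corruption level
Xt = X + sigma * rng.standard_normal((n, D))
cov = A @ A.T + sigma**2 * np.eye(D)
scores = -Xt @ np.linalg.inv(cov)                 # exact scores at noisy points

# Normal components of the score scale like 1/sigma while tangent components
# stay O(1), so the singular spectrum of a batch of scores has D - d dominant
# values; the spectral gap reveals the subspace dimension.
sv = np.linalg.svd(scores, compute_uv=False)      # sorted descending
est_d = D - int(np.sum(sv > sv[0] / 100))
print(est_d)                                      # should recover d = 3 here
```

This mirrors the paper's geometric picture in the simplest possible setting: the dominant score directions span the normal space of the manifold, and counting them recovers the intrinsic dimension.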
arXiv Detail & Related papers (2022-12-23T23:15:14Z)
- A likelihood approach to nonparametric estimation of a singular distribution using deep generative models [4.329951775163721]
We investigate a likelihood approach to nonparametric estimation of a singular distribution using deep generative models.
We prove that a novel and effective solution exists by perturbing the data with an instance noise.
We also characterize the class of distributions that can be efficiently estimated via deep generative models.
arXiv Detail & Related papers (2021-05-09T23:13:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.