Diffusion Probabilistic Fields
- URL: http://arxiv.org/abs/2303.00165v1
- Date: Wed, 1 Mar 2023 01:37:24 GMT
- Title: Diffusion Probabilistic Fields
- Authors: Peiye Zhuang, Samira Abnar, Jiatao Gu, Alex Schwing, Joshua M. Susskind, Miguel Ángel Bautista
- Abstract summary: We introduce Diffusion Probabilistic Fields (DPF), a diffusion model that can learn distributions over continuous functions defined over metric spaces.
We empirically show that DPF effectively deals with different modalities like 2D images and 3D geometry, in addition to modeling distributions over fields defined on non-Euclidean metric spaces.
- Score: 42.428882785136295
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion probabilistic models have quickly become a major approach for
generative modeling of images, 3D geometry, video and other domains. However,
to adapt diffusion generative modeling to these domains the denoising network
needs to be carefully designed for each domain independently, oftentimes under
the assumption that data lives in a Euclidean grid. In this paper we introduce
Diffusion Probabilistic Fields (DPF), a diffusion model that can learn
distributions over continuous functions defined over metric spaces, commonly
known as fields. We extend the formulation of diffusion probabilistic models to
deal with this field parametrization in an explicit way, enabling us to define
an end-to-end learning algorithm that side-steps the requirement of
representing fields with latent vectors as in previous approaches (Dupont et
al., 2022a; Du et al., 2021). We empirically show that, while using the same
denoising network, DPF effectively deals with different modalities like 2D
images and 3D geometry, in addition to modeling distributions over fields
defined on non-Euclidean metric spaces.
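The abstract's key idea is to apply the diffusion process to a field represented explicitly as (coordinate, value) pairs rather than as a latent vector. A minimal sketch of the forward-noising step under that parametrization, assuming a standard DDPM cosine schedule (this is an illustrative reconstruction, not the authors' code):

```python
# Sketch: DDPM-style forward noising applied to a field given explicitly as
# (coordinate, value) pairs. Only the field *values* are noised; the
# metric-space coordinates stay clean and condition the denoiser.
import numpy as np

def cosine_alpha_bar(t, T):
    """Cumulative signal level alpha_bar(t) (cosine schedule)."""
    f = lambda s: np.cos((s / T + 0.008) / 1.008 * np.pi / 2) ** 2
    return f(t) / f(0)

def noise_field(coords, values, t, T, rng):
    """Return noised field values and the Gaussian noise target for the denoiser."""
    a_bar = cosine_alpha_bar(t, T)
    eps = rng.standard_normal(values.shape)
    noised = np.sqrt(a_bar) * values + np.sqrt(1.0 - a_bar) * eps
    return noised, eps

rng = np.random.default_rng(0)
# A toy 2D scalar field sampled on a 8x8 grid of coordinates in [0, 1]^2.
coords = np.stack(np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8)), -1).reshape(-1, 2)
values = np.sin(2 * np.pi * coords[:, :1])
noised, eps = noise_field(coords, values, t=500, T=1000, rng=rng)
```

A denoising network would then take `(coords, noised, t)` and regress `eps`; because the field is queried pointwise, the same network can serve images, 3D geometry, or fields on non-Euclidean domains.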
Related papers
- Non-Denoising Forward-Time Diffusions [4.831663144935879]
We show that the time-reversal argument, common to all denoising diffusion probabilistic modeling proposals, is not necessary.
We obtain diffusion processes targeting the desired data distribution by taking appropriate mixtures of diffusion bridges.
We develop a unifying view of the drift adjustments corresponding to our and to time-reversal approaches.
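The building block behind these mixtures is the diffusion bridge: a diffusion process pinned at both endpoints. A minimal sketch of simulating a Brownian bridge between two points (illustrative only; function names and the Euler-style discretization are mine, not the paper's):

```python
# Sketch: simulate a Brownian bridge from x0 at time 0 to x1 at time 1,
# using the exact per-step bridge transition so the path hits x1 exactly.
import numpy as np

def brownian_bridge(x0, x1, n_steps, sigma=1.0, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    ts = np.linspace(0.0, 1.0, n_steps + 1)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        t, dt = ts[i], ts[i + 1] - ts[i]
        # Bridge drift pulls the path toward the pinned endpoint x1;
        # the transition variance shrinks to zero as t -> 1.
        mean = x[i] + (x1 - x[i]) * dt / (1.0 - t)
        var = sigma**2 * dt * (1.0 - t - dt) / (1.0 - t)
        x[i + 1] = mean + np.sqrt(var) * rng.standard_normal()
    return x

path = brownian_bridge(0.0, 2.0, n_steps=100, rng=np.random.default_rng(0))
```

Mixing such bridges over endpoints drawn from the data distribution yields a forward-time process whose terminal law is the data distribution, which is the construction the summary above alludes to.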
arXiv Detail & Related papers (2023-12-22T10:26:31Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs)
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- Discovery and Expansion of New Domains within Diffusion Models [41.25905891327446]
We study the generalization properties of diffusion models in a few-shot setup.
We introduce a novel tuning-free paradigm to synthesize the target out-of-domain data.
arXiv Detail & Related papers (2023-10-13T16:07:31Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that under these conditions, the generative functional model admits the same symmetries.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- T1: Scaling Diffusion Probabilistic Fields to High-Resolution on Unified Visual Modalities [69.16656086708291]
Diffusion Probabilistic Field (DPF) models the distribution of continuous functions defined over metric spaces.
We propose a new model comprising a view-wise sampling algorithm to focus on local structure learning.
The model can be scaled to generate high-resolution data while unifying multiple modalities.
arXiv Detail & Related papers (2023-05-24T03:32:03Z)
- Infinite-Dimensional Diffusion Models [4.342241136871849]
We formulate diffusion-based generative models in infinite dimensions and apply them to the generative modeling of functions.
We show that our formulations are well posed in the infinite-dimensional setting and provide dimension-independent distance bounds from the sample to the target measure.
We also develop guidelines for the design of infinite-dimensional diffusion models.
arXiv Detail & Related papers (2023-02-20T18:00:38Z)
- Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z)
- Let us Build Bridges: Understanding and Extending Diffusion Generative Models [19.517597928769042]
Diffusion-based generative models have achieved promising results recently, but raise an array of open questions.
This work re-examines the overall framework in order to gain a better theoretical understanding.
We present 1) a first theoretical error analysis for learning diffusion generation models, and 2) a simple and unified approach to learning on data from different discrete and constrained domains.
arXiv Detail & Related papers (2022-08-31T08:58:10Z)
- Diffusion models as plug-and-play priors [98.16404662526101]
We consider the problem of inferring high-dimensional data $\mathbf{x}$ in a model that consists of a prior $p(\mathbf{x})$ and an auxiliary constraint $c(\mathbf{x},\mathbf{y})$.
The structure of diffusion models allows us to perform approximate inference by iterating differentiation through the fixed denoising network enriched with different amounts of noise.
arXiv Detail & Related papers (2022-06-17T21:11:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.