Trans-Dimensional Generative Modeling via Jump Diffusion Models
- URL: http://arxiv.org/abs/2305.16261v2
- Date: Mon, 30 Oct 2023 10:14:44 GMT
- Title: Trans-Dimensional Generative Modeling via Jump Diffusion Models
- Authors: Andrew Campbell, William Harvey, Christian Weilbach, Valentin De
Bortoli, Tom Rainforth, Arnaud Doucet
- Abstract summary: We propose a new class of generative models that naturally handle data of varying dimensionality.
We first define a dimension destroying forward noising process, before deriving the dimension creating time-reversed generative process.
Simulating our learned approximation to the time-reversed generative process then provides an effective way of sampling data of varying dimensionality.
- Score: 46.183265841345644
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new class of generative models that naturally handle data of
varying dimensionality by jointly modeling the state and dimension of each
datapoint. The generative process is formulated as a jump diffusion process
that makes jumps between different dimensional spaces. We first define a
dimension destroying forward noising process, before deriving the dimension
creating time-reversed generative process along with a novel evidence lower
bound training objective for learning to approximate it. Simulating our learned
approximation to the time-reversed generative process then provides an
effective way of sampling data of varying dimensionality by jointly generating
state values and dimensions. We demonstrate our approach on molecular and video
datasets of varying dimensionality, reporting better compatibility with
test-time diffusion guidance imputation tasks and improved interpolation
capabilities versus fixed dimensional models that generate state values and
dimensions separately.
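The dimension-destroying forward process described in the abstract can be illustrated with a minimal sketch: between jumps each surviving component follows an ordinary variance-preserving diffusion, and at Poisson-distributed jump times one component is deleted, so the state's dimension shrinks over time. This is an illustrative toy, not the paper's implementation; the function name `forward_noising`, the Ornstein-Uhlenbeck drift, and the `jump_rate` parameter are assumptions for the sketch.

```python
import numpy as np

def forward_noising(x0, T=1.0, n_steps=1000, jump_rate=2.0, seed=0):
    """Toy dimension-destroying forward noising process.

    Between jumps, each remaining component follows a simple
    variance-preserving (Ornstein-Uhlenbeck) Euler-Maruyama step; at
    jump times, approximated as events of a Poisson process with
    intensity `jump_rate`, one uniformly chosen component is deleted,
    shrinking the state's dimension by one.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.asarray(x0, dtype=float).copy()
    trajectory = [(0.0, x.copy())]
    for step in range(n_steps):
        # Diffusion step on the surviving components (VP-style drift toward 0)
        x = x - 0.5 * x * dt + np.sqrt(dt) * rng.standard_normal(x.shape)
        # With probability ~ jump_rate * dt, destroy one dimension
        if x.size > 1 and rng.random() < jump_rate * dt:
            kill = rng.integers(x.size)
            x = np.delete(x, kill)
        trajectory.append(((step + 1) * dt, x.copy()))
    return trajectory

traj = forward_noising(np.ones(8))
print("start dim:", traj[0][1].size, "-> end dim:", traj[-1][1].size)
```

The generative model learns the time reversal of such a process: a jump diffusion that both denoises state values and *creates* dimensions, so sampling jointly produces a dimension and a state.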
Related papers
- Towards Model-Agnostic Dataset Condensation by Heterogeneous Models [13.170099297210372]
We develop a novel method to produce universally applicable condensed images through cross-model interactions.
By balancing the contribution of each model and maintaining their semantic meaning closely, our approach overcomes the limitations associated with model-specific condensed images.
arXiv Detail & Related papers (2024-09-22T17:13:07Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved to be one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Private Gradient Estimation is Useful for Generative Modeling [25.777591229903596]
We present a new private generative modeling approach where samples are generated via Hamiltonian dynamics with gradients of the private dataset estimated by a well-trained network.
Our model is able to generate data with a resolution of 256x256.
arXiv Detail & Related papers (2023-05-18T02:51:17Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Multilevel Diffusion: Infinite Dimensional Score-Based Diffusion Models for Image Generation [2.5556910002263984]
Score-based diffusion models (SBDM) have emerged as state-of-the-art approaches for image generation.
This paper develops SBDMs in the infinite-dimensional setting, that is, we model the training data as functions supported on a rectangular domain.
We demonstrate how to overcome two shortcomings of current SBDM approaches in the infinite-dimensional setting.
arXiv Detail & Related papers (2023-03-08T18:10:10Z)
- Data-driven low-dimensional dynamic model of Kolmogorov flow [0.0]
Reduced order models (ROMs) that capture flow dynamics are of interest for decreasing computational costs for simulation.
This work presents a data-driven framework for minimal-dimensional models that effectively capture the dynamics and properties of the flow.
We apply this to Kolmogorov flow in a regime consisting of chaotic and intermittent behavior.
arXiv Detail & Related papers (2022-10-29T23:05:39Z)
- Learning Neural Generative Dynamics for Molecular Conformation Generation [89.03173504444415]
We study how to generate molecule conformations (i.e., 3D structures) from a molecular graph.
We propose a novel probabilistic framework to generate valid and diverse conformations given a molecular graph.
arXiv Detail & Related papers (2021-02-20T03:17:58Z)
- Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows [40.9137348900942]
We propose a novel type of flow driven by a differential deformation of the Wiener process.
As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process.
arXiv Detail & Related papers (2020-02-24T20:13:43Z)