Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes
- URL: http://arxiv.org/abs/2310.07216v2
- Date: Sun, 2 Jun 2024 18:31:06 GMT
- Title: Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes
- Authors: Jaehyeong Jo, Sung Ju Hwang
- Abstract summary: We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
- Score: 57.396578974401734
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning the distribution of data on Riemannian manifolds is crucial for modeling data from non-Euclidean space, which is required by many applications in diverse scientific fields. Yet, existing generative models on manifolds suffer from expensive divergence computation or rely on approximations of the heat kernel. These limitations restrict their applicability to simple geometries and hinder scalability to high dimensions. In this work, we introduce the Riemannian Diffusion Mixture, a principled framework for building a generative diffusion process on manifolds. Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes derived on general manifolds without requiring heat kernel estimations. We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points that guides the process toward the data distribution. We further propose a scalable training objective for learning the mixture process that readily applies to general manifolds. Our method achieves superior performance on diverse manifolds with a dramatically reduced number of in-training simulation steps for general manifolds.
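As a concrete illustration of the drift described above: on a manifold with a tractable exponential/logarithm map such as the sphere, a mixture-of-bridges drift can be written as a weighted average of log-map directions toward the data points, scaled by the remaining time. The sketch below is a minimal illustration on S^2 and is not the paper's implementation; the Gaussian-style bridge weights, the 1/(T - t) scaling, and the temperature `tau` are illustrative assumptions.

```python
import numpy as np

def sphere_log(x, z):
    """Logarithm map on the unit sphere: tangent vector at x pointing toward z,
    with norm equal to the geodesic distance."""
    v = z - np.dot(x, z) * x                 # project z onto the tangent space at x
    vnorm = np.linalg.norm(v)
    if vnorm < 1e-12:                        # x and z (anti)parallel: degenerate log map
        return np.zeros_like(x)
    dist = np.arccos(np.clip(np.dot(x, z), -1.0, 1.0))
    return dist * v / vnorm

def mixture_drift(x, t, data, T=1.0, tau=0.1):
    """Drift at state x, time t: a weighted mean of tangent directions log_x(z_j)
    toward the data points, scaled by 1/(T - t). The softmax-style weights over
    geodesic distances are an illustrative choice, not the paper's exact bridges."""
    logs = np.stack([sphere_log(x, z) for z in data])      # (N, 3) tangent vectors
    dists = np.linalg.norm(logs, axis=1)                   # geodesic distances to data
    w = np.exp(-dists**2 / (2.0 * tau * (T - t)))          # bridge-like weights (assumption)
    w = w / w.sum()
    return (w[:, None] * logs).sum(axis=0) / (T - t)

# Toy usage: drift on S^2 toward three random data points.
rng = np.random.default_rng(0)
data = rng.normal(size=(3, 3))
data /= np.linalg.norm(data, axis=1, keepdims=True)
x = np.array([0.0, 0.0, 1.0])
print(mixture_drift(x, t=0.5, data=data))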
Related papers
- Distillation of Discrete Diffusion through Dimensional Correlations [21.078500510691747]
"Mixture" models in discrete diffusion are capable of treating dimensional correlations while remaining scalable.
We empirically demonstrate that our proposed method for discrete diffusions works in practice by distilling a continuous-time discrete diffusion model pretrained on the CIFAR-10 dataset.
arXiv Detail & Related papers (2024-10-11T10:53:03Z) - Score matching for sub-Riemannian bridge sampling [2.048226951354646]
Recent progress in machine learning can be adapted to train score approximators on sub-Riemannian gradients.
We perform numerical experiments exemplifying samples from the bridge process on the Heisenberg group and the concentration of this process for small time.
arXiv Detail & Related papers (2024-04-23T17:45:53Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Eliminating Lipschitz Singularities in Diffusion Models [51.806899946775076]
We show that diffusion models frequently exhibit infinite Lipschitz constants near the zero point of the timesteps.
This poses a threat to the stability and accuracy of the diffusion process, which relies on integral operations.
We propose a novel approach, dubbed E-TSDM, which eliminates the Lipschitz singularity of the diffusion model near zero.
arXiv Detail & Related papers (2023-06-20T03:05:28Z) - A Heat Diffusion Perspective on Geodesic Preserving Dimensionality
Reduction [66.21060114843202]
We propose a more general heat kernel based manifold embedding method that we call heat geodesic embeddings.
Results show that our method outperforms existing state of the art in preserving ground truth manifold distances.
We also showcase our method on single cell RNA-sequencing datasets with both continuum and cluster structure.
arXiv Detail & Related papers (2023-05-30T13:58:50Z) - Riemannian Diffusion Models [11.306081315276089]
Diffusion models are recent state-of-the-art methods for image generation and likelihood estimation.
In this work, we generalize continuous-time diffusion models to arbitrary Riemannian manifolds.
Our proposed method achieves new state-of-the-art likelihoods on all benchmarks.
arXiv Detail & Related papers (2022-08-16T21:18:31Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) demonstrate remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - Learning Manifold Implicitly via Explicit Heat-Kernel Learning [63.354671267760516]
We propose the concept of implicit manifold learning, where manifold information is implicitly obtained by learning the associated heat kernel.
The learned heat kernel can be applied to various kernel-based machine learning models, including deep generative models (DGM) for data generation and Stein Variational Gradient Descent for Bayesian inference.
arXiv Detail & Related papers (2020-10-05T03:39:58Z) - Intrinsic Gaussian Processes on Manifolds and Their Accelerations by
Symmetry [9.773237080061815]
Existing methods primarily focus on low dimensional constrained domains for heat kernel estimation.
Our research proposes an intrinsic approach for constructing GPs on general manifolds.
Our methodology estimates the heat kernel by simulating Brownian motion sample paths using the exponential map.
arXiv Detail & Related papers (2020-06-25T09:17:40Z)