Scaling Riemannian Diffusion Models
- URL: http://arxiv.org/abs/2310.20030v1
- Date: Mon, 30 Oct 2023 21:27:53 GMT
- Title: Scaling Riemannian Diffusion Models
- Authors: Aaron Lou, Minkai Xu, Stefano Ermon
- Abstract summary: We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
- Score: 68.52820280448991
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Riemannian diffusion models draw inspiration from standard Euclidean space
diffusion models to learn distributions on general manifolds. Unfortunately,
the additional geometric complexity renders the diffusion transition term
inexpressible in closed form, so prior methods resort to imprecise
approximations of the score matching training objective that degrade
performance and preclude applications in high dimensions. In this work, we
reexamine these approximations and propose several practical improvements. Our
key observation is that most relevant manifolds are symmetric spaces, which are
much more amenable to computation. By leveraging and combining various
ansätze, we can quickly compute relevant quantities to high precision. On
low dimensional datasets, our correction produces a noticeable improvement,
allowing diffusion to compete with other methods. Additionally, we show that
our method enables us to scale to high dimensional tasks on nontrivial
manifolds. In particular, we model QCD densities on $SU(n)$ lattices and
contrastively learned embeddings on high dimensional hyperspheres.
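The "combining ansätze" idea can be sketched on the simplest symmetric space, the circle S^1: its heat kernel has a Fourier expansion that converges rapidly for large diffusion times and a wrapped-Gaussian expansion that converges rapidly for small times, and switching between the two gives fast, uniformly high-precision evaluation. A minimal illustrative sketch (function names and truncation levels are ours, not the paper's):

```python
import numpy as np

def heat_kernel_fourier(theta, t, kmax=50):
    """Heat kernel on the circle via its Fourier (long-time) expansion:
    p_t(theta) = (1/2pi) * (1 + 2 * sum_k exp(-k^2 t / 2) * cos(k*theta))."""
    k = np.arange(1, kmax + 1)
    return (1.0 + 2.0 * np.sum(np.exp(-k**2 * t / 2.0) * np.cos(k * theta))) / (2.0 * np.pi)

def heat_kernel_wrapped(theta, t, nmax=50):
    """Same kernel via the wrapped-Gaussian (short-time) expansion:
    p_t(theta) = sum_n N(theta + 2*pi*n; 0, t)."""
    n = np.arange(-nmax, nmax + 1)
    x = theta + 2.0 * np.pi * n
    return np.sum(np.exp(-x**2 / (2.0 * t))) / np.sqrt(2.0 * np.pi * t)
```

The two expansions are equal by Poisson summation, so one can pick whichever converges fastest at a given noise level; differentiating the chosen expansion in `theta` then yields the score needed for score matching.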
Related papers
- On Probabilistic Pullback Metrics on Latent Hyperbolic Manifolds [5.724027955589408]
This paper focuses on the hyperbolic manifold, a particularly suitable choice for modeling hierarchical relationships.
We propose augmenting the hyperbolic metric with a pullback metric to account for distortions introduced by the GPLVM's nonlinear mapping.
Through various experiments, we demonstrate that geodesics on the pullback metric not only respect the geometry of the hyperbolic latent space but also align with the underlying data distribution.
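As a rough illustration of the pullback construction: a map f from latent space into an ambient Euclidean space pulls the ambient metric back to G(z) = J_f(z)^T J_f(z), so latent curve lengths account for the map's distortion. This generic Euclidean sketch omits the paper's hyperbolic structure, and the map `f` below is hypothetical:

```python
import numpy as np

def numerical_jacobian(f, z, eps=1e-6):
    """Finite-difference Jacobian of f: R^d -> R^D at latent point z."""
    z = np.asarray(z, dtype=float)
    f0 = np.asarray(f(z))
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (np.asarray(f(z + dz)) - f0) / eps
    return J

def pullback_metric(f, z):
    """Pullback of the ambient Euclidean metric through f: G(z) = J^T J.
    A latent velocity v then has squared length v^T G(z) v."""
    J = numerical_jacobian(f, z)
    return J.T @ J
```

Geodesics under G then bend toward regions where f stretches space the least, which is why they tend to follow the data distribution.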
arXiv Detail & Related papers (2024-10-28T09:13:00Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data, which typically lies on an implicit low-dimensional manifold.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a sample denoised by Tweedie's formula forms a Krylov subspace, then CG initialized with the denoised data keeps the data-consistency update in that tangent space.
Our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
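The interplay described above can be sketched generically: a Tweedie denoising step followed by conjugate gradient on the normal equations, whose iterates by construction stay in an affine Krylov subspace anchored at the denoised estimate. The score function, operator `A`, and measurements `y` below are placeholders, not the paper's setup:

```python
import numpy as np

def tweedie_denoise(x_t, score, sigma):
    """Tweedie's formula for Gaussian noise: E[x_0 | x_t] = x_t + sigma^2 * score(x_t)."""
    return x_t + sigma**2 * score(x_t)

def cg_data_consistency(A, y, x0, iters=20):
    """Conjugate gradient on the normal equations A^T A x = A^T y, started
    from the denoised estimate x0. Each iterate lies in
    x0 + span{r0, (A^T A) r0, (A^T A)^2 r0, ...}, a Krylov subspace."""
    x = x0.copy()
    r = A.T @ (y - A @ x)          # initial residual r0
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = A.T @ (A @ p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < 1e-12:         # converged
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

In the paper's setting, the point is that this data-consistency correction never leaves the (Krylov) tangent space at the denoised sample, so no extra projection step is needed.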
arXiv Detail & Related papers (2023-03-10T07:42:49Z)
- Riemannian Diffusion Models [11.306081315276089]
Diffusion models are recent state-of-the-art methods for image generation and likelihood estimation.
In this work, we generalize continuous-time diffusion models to arbitrary Riemannian manifolds.
Our proposed method achieves new state-of-the-art likelihoods on all benchmarks.
arXiv Detail & Related papers (2022-08-16T21:18:31Z)
- Diagnosing and Fixing Manifold Overfitting in Deep Generative Models [11.82509693248749]
Likelihood-based, or explicit, deep generative models use neural networks to construct flexible high-dimensional densities.
We show that observed data lies on a low-dimensional manifold embedded in high-dimensional ambient space.
We propose a class of two-step procedures consisting of a dimensionality reduction step followed by maximum-likelihood density estimation.
arXiv Detail & Related papers (2022-04-14T18:00:03Z)
- Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) demonstrate remarkable empirical performance.
Current SGMs rely on the assumption that the data is supported on a Euclidean space with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z)
- Rectangular Flows for Manifold Learning [38.63646804834534]
Normalizing flows are invertible neural networks with tractable change-of-volume terms.
Data of interest is typically assumed to live in some (often unknown) low-dimensional manifold embedded in high-dimensional ambient space.
We propose two methods to tractably compute the gradient of this term with respect to the parameters of the model.
arXiv Detail & Related papers (2021-06-02T18:30:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.