Neural Manifold Ordinary Differential Equations
- URL: http://arxiv.org/abs/2006.10254v1
- Date: Thu, 18 Jun 2020 03:24:58 GMT
- Title: Neural Manifold Ordinary Differential Equations
- Authors: Aaron Lou, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser-Nam
Lim, Christopher De Sa
- Abstract summary: We introduce Neural Manifold Ordinary Differential Equations, which enable the construction of Manifold Continuous Normalizing Flows (MCNFs).
MCNFs require only local geometry and compute probabilities with continuous change of variables.
We find that leveraging continuous manifold dynamics produces a marked improvement for both density estimation and downstream tasks.
- Score: 46.25832801867149
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To better conform to data geometry, recent deep generative modelling
techniques adapt Euclidean constructions to non-Euclidean spaces. In this
paper, we study normalizing flows on manifolds. Previous work has developed
flow models for specific cases; however, these advancements hand-craft layers
on a manifold-by-manifold basis, restricting generality and inducing cumbersome
design constraints. We overcome these issues by introducing Neural Manifold
Ordinary Differential Equations, a manifold generalization of Neural ODEs,
which enables the construction of Manifold Continuous Normalizing Flows
(MCNFs). MCNFs require only local geometry (therefore generalizing to arbitrary
manifolds) and compute probabilities with continuous change of variables
(allowing for a simple and expressive flow construction). We find that
leveraging continuous manifold dynamics produces a marked improvement for both
density estimation and downstream tasks.
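The mechanism the abstract describes, integrating an ODE on the manifold while tracking log-density through the continuous change of variables, can be sketched on the circle S^1, the simplest compact manifold. This is an illustrative toy, not the authors' construction: the vector field f, the Euler integrator, and the step counts below are all assumptions.

```python
import numpy as np

# Toy manifold CNF on the circle S^1, parameterized by the angle theta.
# Dynamics:            d theta / dt = f(theta)
# Change of variables: d log p / dt = -div f(theta)  (here, -f'(theta))

def f(theta):
    # A hand-picked smooth, periodic tangent vector field (an assumption).
    return 0.5 * np.sin(theta)

def div_f(theta):
    # Divergence of f in the angle chart: just the derivative f'(theta).
    return 0.5 * np.cos(theta)

def flow(theta0, log_p0, t1=1.0, n_steps=200):
    # Jointly Euler-integrate the state and its log-density.
    dt = t1 / n_steps
    theta, log_p = theta0.copy(), log_p0.copy()
    for _ in range(n_steps):
        log_p = log_p - div_f(theta) * dt              # continuous change of variables
        theta = (theta + f(theta) * dt) % (2 * np.pi)  # stay on the manifold
    return theta, log_p

# Push the uniform base density on S^1 through the flow.
n = 2000
theta0 = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
log_p0 = np.full(n, -np.log(2.0 * np.pi))
theta1, log_p1 = flow(theta0, log_p0)

# Sanity check: the transported density should still integrate to ~1
# over the circle (Riemann sum with wrap-around spacing).
order = np.argsort(theta1)
th, p = theta1[order], np.exp(log_p1)[order]
dth = np.diff(np.append(th, th[0] + 2.0 * np.pi))
mass = np.sum(p * dth)
print(f"total probability after the flow: {mass:.3f}")  # close to 1.000
```

On a general manifold the same recipe applies chart by chart: only the local expression of the vector field and its divergence is needed, which is why the construction does not require manifold-specific layers.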
Related papers
- Topological Obstructions and How to Avoid Them [22.45861345237023]
We show that local optima can arise due to singularities or an incorrect degree or winding number.
We propose a new flow-based model that maps data points to multimodal distributions over geometric spaces.
arXiv Detail & Related papers (2023-12-12T18:56:14Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Equivariant Discrete Normalizing Flows [10.867162810786361]
We focus on building equivariant normalizing flows using discrete layers.
We introduce two new equivariant flows: $G$-coupling Flows and $G$-Residual Flows.
Our construction of $G$-Residual Flows is also universal, in the sense that we prove any $G$-equivariant diffeomorphism can be exactly mapped by a $G$-Residual Flow.
arXiv Detail & Related papers (2021-10-16T20:16:00Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- Continuous normalizing flows on manifolds [0.342658286826597]
We describe how the recently introduced Neural ODEs and continuous normalizing flows can be extended to arbitrary smooth manifolds.
We propose a general methodology for parameterizing vector fields on these spaces and demonstrate how gradient-based learning can be performed.
arXiv Detail & Related papers (2021-03-14T15:35:19Z)
- Neural Ordinary Differential Equations on Manifolds [0.342658286826597]
Recently, normalizing flows in Euclidean space based on Neural ODEs have shown great promise, yet they suffer the same limitations.
We show how vector fields provide a general framework for parameterizing a flexible class of invertible mappings on these spaces.
arXiv Detail & Related papers (2020-06-11T17:56:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.