Neural Ordinary Differential Equations on Manifolds
- URL: http://arxiv.org/abs/2006.06663v1
- Date: Thu, 11 Jun 2020 17:56:34 GMT
- Title: Neural Ordinary Differential Equations on Manifolds
- Authors: Luca Falorsi and Patrick Forré
- Abstract summary: Recently, normalizing flows in Euclidean space based on Neural ODEs have shown great promise, yet suffer the same limitations.
We show how vector fields provide a general framework for parameterizing a flexible class of invertible mappings on these spaces.
- Score: 0.342658286826597
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a powerful technique for obtaining reparameterizable
samples from complex multimodal distributions. Unfortunately current approaches
fall short when the underlying space has a nontrivial topology, and are only
available for the most basic geometries. Recently, normalizing flows in
Euclidean space based on Neural ODEs have shown great promise, yet suffer the same
limitations. Using ideas from differential geometry and geometric control
theory, we describe how neural ODEs can be extended to smooth manifolds. We
show how vector fields provide a general framework for parameterizing a
flexible class of invertible mappings on these spaces, and we illustrate how
gradient-based learning can be performed. As a result, we define a general
methodology for building normalizing flows on manifolds.
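To make the core idea concrete, the following sketch (not from the paper; the projection/renormalization scheme, the toy weight matrix `W`, and all function names are illustrative assumptions) integrates a parameterized vector field projected onto the tangent spaces of the sphere $S^2$, so the resulting flow is a map from the manifold to itself:

```python
import numpy as np

def tangent_project(x, v):
    """Project an ambient vector v onto the tangent space of S^2 at x."""
    return v - np.dot(x, v) * x

def vector_field(x, W):
    """Illustrative parameterized ambient field; W stands in for the
    learnable weights of a neural network."""
    return np.tanh(W @ x)

def manifold_ode_flow(x0, W, steps=100, dt=0.01):
    """Euler-integrate the projected field, renormalizing after each
    step so the trajectory stays on the sphere."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        v = tangent_project(x, vector_field(x, W))
        x = x + dt * v
        x = x / np.linalg.norm(x)   # retract back onto S^2
    return x

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))
x1 = manifold_ode_flow(np.array([1.0, 0.0, 0.0]), W)
print(np.linalg.norm(x1))   # the output point remains on the unit sphere
```

For small step sizes the map is invertible (run the field backwards to undo it), which is what makes such flows usable as normalizing flows on the manifold.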
Related papers
- Topological Obstructions and How to Avoid Them [22.45861345237023]
We show that local optima can arise due to singularities or an incorrect degree or winding number.
We propose a new flow-based model that maps data points to multimodal distributions over geometric spaces.
arXiv Detail & Related papers (2023-12-12T18:56:14Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a denoised sample given by Tweedie's formula forms a Krylov subspace, then conjugate gradient (CG) iterations initialized with the denoised data keep the data-consistency update in that tangent space.
Our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
arXiv Detail & Related papers (2023-03-10T07:42:49Z)
- Flow Matching on General Geometries [43.252817099263744]
We propose a simple yet powerful framework for training continuous normalizing flows on manifold geometries.
We show that it is simulation-free on simple geometries, does not require divergence computation, and computes its target vector field in closed form.
Our method achieves state-of-the-art performance on many real-world non-Euclidean datasets.
arXiv Detail & Related papers (2023-02-07T18:21:24Z)
- Deep Learning Approximation of Diffeomorphisms via Linear-Control Systems [91.3755431537592]
We consider a control system of the form $\dot{x} = \sum_{i=1}^{l} F_i(x)\,u_i$, with linear dependence on the controls.
We use the corresponding flow to approximate the action of a diffeomorphism on a compact ensemble of points.
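The flow of such a control system can be sketched with a forward-Euler integrator; the specific fields `F1`, `F2` and the piecewise-constant controls below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

# Two illustrative vector fields on R^2: a rotation field and a
# constant translation field. The controls u_i enter linearly.
def F1(x):
    return np.array([-x[1], x[0]])

def F2(x):
    return np.array([1.0, 0.0])

def control_flow(x0, u, fields=(F1, F2), dt=0.01):
    """Euler integration of dx/dt = sum_i u_i(t) F_i(x), where u is a
    (steps, n_fields) array of piecewise-constant controls."""
    x = np.asarray(x0, dtype=float)
    for k in range(u.shape[0]):
        x = x + dt * sum(u[k, i] * F(x) for i, F in enumerate(fields))
    return x

u = np.ones((100, 2))              # constant controls u_1 = u_2 = 1
x_final = control_flow([1.0, 0.0], u)
print(x_final)
```

Choosing the controls then amounts to choosing which diffeomorphism the time-1 flow approximates on a given ensemble of points.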
arXiv Detail & Related papers (2021-10-24T08:57:46Z)
- Semi-Riemannian Graph Convolutional Networks [36.09315878397234]
We develop a principled Semi-Riemannian GCN that first models data in semi-Riemannian manifolds of constant nonzero curvature.
Our method provides a geometric inductive bias that is sufficiently flexible to model mixed heterogeneous topologies like hierarchical graphs with cycles.
arXiv Detail & Related papers (2021-06-06T14:23:34Z)
- Continuous normalizing flows on manifolds [0.342658286826597]
We describe how the recently introduced Neural ODEs and continuous normalizing flows can be extended to arbitrary smooth manifolds.
We propose a general methodology for parameterizing vector fields on these spaces and demonstrate how gradient-based learning can be performed.
arXiv Detail & Related papers (2021-03-14T15:35:19Z)
- ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks [86.37110868126548]
In this work, we make use of deep residual neural networks to solve the non-stationary ODE (flow equation) based on Euler's discretization scheme.
We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations.
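The correspondence between residual blocks and Euler steps can be sketched as follows; the random weights and `tanh` dynamics are illustrative stand-ins, not the paper's trained network:

```python
import numpy as np

rng = np.random.default_rng(1)
# Each residual block x -> x + h * g_k(x) is one Euler step of a
# non-stationary ODE dx/dt = g(t, x); the per-block weights W_k play
# the role of the time-varying dynamics.
weights = [rng.standard_normal((2, 2)) * 0.1 for _ in range(10)]

def resnet_euler(x, h=0.1):
    for Wk in weights:
        x = x + h * np.tanh(Wk @ x)   # residual block = Euler step
    return x

y = resnet_euler(np.array([1.0, 2.0]))
print(y)
```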
arXiv Detail & Related papers (2021-02-16T04:07:13Z)
- Neural Manifold Ordinary Differential Equations [46.25832801867149]
We introduce Neural Manifold Ordinary Differential Equations, which enable the construction of Manifold Continuous Normalizing Flows (MCNFs).
MCNFs require only local geometry and compute probabilities with continuous change of variables.
We find that leveraging continuous manifold dynamics produces a marked improvement for both density estimation and downstream tasks.
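The continuous change of variables used by such flows, $d(\log p)/dt = -\operatorname{div} f$, can be sketched with a finite-difference divergence; the field `f` and all names below are illustrative assumptions, not the MCNF construction itself:

```python
import numpy as np

def f(x):
    """Illustrative dynamics; stands in for a learned vector field."""
    return np.tanh(x)

def divergence(f, x, eps=1e-5):
    """Finite-difference trace of the Jacobian of f at x."""
    d = x.size
    div = 0.0
    for i in range(d):
        e = np.zeros(d); e[i] = eps
        div += (f(x + e)[i] - f(x - e)[i]) / (2 * eps)
    return div

def cnf_flow(x0, steps=100, dt=0.01):
    """Euler-integrate the flow ODE jointly with the instantaneous
    change of variables d(log p)/dt = -div f, accumulating the
    log-density change along the trajectory."""
    x = np.asarray(x0, dtype=float)
    delta_logp = 0.0
    for _ in range(steps):
        delta_logp -= dt * divergence(f, x)
        x = x + dt * f(x)
    return x, delta_logp

x_out, dlogp = cnf_flow(np.array([0.5, -0.3]))
print(x_out, dlogp)
```

In practice the divergence is computed exactly or via stochastic trace estimators rather than finite differences, but the joint integration of state and log-density is the same.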
arXiv Detail & Related papers (2020-06-18T03:24:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.