Flow Matching on General Geometries
- URL: http://arxiv.org/abs/2302.03660v3
- Date: Mon, 26 Feb 2024 17:52:00 GMT
- Title: Flow Matching on General Geometries
- Authors: Ricky T. Q. Chen, Yaron Lipman
- Abstract summary: We propose a simple yet powerful framework for training continuous normalizing flows on manifold geometries.
We show that it is simulation-free on simple geometries, does not require divergence computation, and computes its target vector field in closed form.
Our method achieves state-of-the-art performance on many real-world non-Euclidean datasets.
- Score: 43.252817099263744
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose Riemannian Flow Matching (RFM), a simple yet powerful framework
for training continuous normalizing flows on manifolds. Existing methods for
generative modeling on manifolds either require expensive simulation, are
inherently unable to scale to high dimensions, or use approximations for
limiting quantities that result in biased training objectives. Riemannian Flow
Matching bypasses these limitations and offers several advantages over previous
approaches: it is simulation-free on simple geometries, does not require
divergence computation, and computes its target vector field in closed-form.
The key ingredient behind RFM is the construction of a relatively simple
premetric for defining target vector fields, which encompasses the existing
Euclidean case. To extend to general geometries, we rely on the use of spectral
decompositions to efficiently compute premetrics on the fly. Our method
achieves state-of-the-art performance on many real-world non-Euclidean
datasets, and we demonstrate tractable training on general geometries,
including triangular meshes with highly non-trivial curvature and boundaries.
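As background for how simulation-free training looks in this setting, the standard conditional flow matching objective regresses a learned vector field onto a closed-form conditional target; on a Riemannian manifold the regression norm is taken in the manifold's metric $g$. The display below is a sketch of the generic objective, not a quotation of the paper's exact formulation:

$$\mathcal{L}(\theta) = \mathbb{E}_{t \sim \mathcal{U}[0,1],\; x_1 \sim q,\; x_t \sim p_t(\cdot \mid x_1)} \big\| v_\theta(t, x_t) - u_t(x_t \mid x_1) \big\|_g^2$$

On a simple geometry such as the unit sphere, the conditional path $x_t$ and its target velocity $u_t(x_t \mid x_1)$ are available in closed form along geodesics, so training requires no ODE simulation. The sketch below illustrates one such training step in PyTorch; the model signature `v_theta(x_t, t)`, the uniform base distribution, and the optimizer are hypothetical placeholders, and the code covers only the simple-geometry case rather than the paper's general premetric construction.

```python
import torch

def slerp_and_velocity(x0, x1, t):
    """Geodesic interpolant on the unit sphere and its time derivative.

    x0, x1: (batch, d) unit vectors; t: (batch, 1) in [0, 1].
    Returns x_t on the geodesic from x0 to x1 and the closed-form
    target velocity u_t = d x_t / dt (a tangent vector at x_t).
    """
    cos_w = (x0 * x1).sum(-1, keepdim=True).clamp(-1 + 1e-6, 1 - 1e-6)
    w = torch.arccos(cos_w)            # geodesic distance (angle between x0 and x1)
    sin_w = torch.sin(w)
    x_t = (torch.sin((1 - t) * w) * x0 + torch.sin(t * w) * x1) / sin_w
    u_t = (-w * torch.cos((1 - t) * w) * x0 + w * torch.cos(t * w) * x1) / sin_w
    return x_t, u_t

def training_step(v_theta, x1, optimizer):
    """One simulation-free flow matching step on the sphere: regress the
    model's tangent output onto the closed-form conditional target."""
    x0 = torch.nn.functional.normalize(torch.randn_like(x1), dim=-1)  # uniform base sample
    t = torch.rand(x1.shape[0], 1, device=x1.device)
    x_t, u_t = slerp_and_velocity(x0, x1, t)
    v = v_theta(x_t, t)
    v = v - (v * x_t).sum(-1, keepdim=True) * x_t  # project onto the tangent space at x_t
    loss = ((v - u_t) ** 2).sum(-1).mean()         # metric norm = Euclidean norm for the embedded sphere
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

For general geometries without closed-form geodesics, the abstract notes that the premetric is instead computed on the fly via spectral decompositions; that machinery is not reproduced in this sketch.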
Related papers
- Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z)
- Metric Flow Matching for Smooth Interpolations on the Data Manifold [40.24392451848883]
Metric Flow Matching (MFM) is a novel simulation-free framework for conditional flow matching, in which conditional paths transform a source distribution into a target distribution.
We test MFM on a suite of challenges including LiDAR navigation, unpaired image translation, and modeling cellular dynamics.
arXiv Detail & Related papers (2024-05-23T16:48:06Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal stochastic approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
- Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms is not only simple and efficient but also applicable to a broad range of contexts.
arXiv Detail & Related papers (2021-07-23T13:43:34Z)
- Continuous normalizing flows on manifolds [0.342658286826597]
We describe how the recently introduced Neural ODEs and continuous normalizing flows can be extended to arbitrary smooth manifolds.
We propose a general methodology for parameterizing vector fields on these spaces and demonstrate how gradient-based learning can be performed.
arXiv Detail & Related papers (2021-03-14T15:35:19Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data; however, the geometric operations involved are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
- Neural Ordinary Differential Equations on Manifolds [0.342658286826597]
Recently, normalizing flows in Euclidean space based on Neural ODEs have shown great promise, yet they suffer from the same limitations.
We show how vector fields provide a general framework for parameterizing a flexible class of invertible mappings on these spaces.
arXiv Detail & Related papers (2020-06-11T17:56:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.