Riemannian Convex Potential Maps
- URL: http://arxiv.org/abs/2106.10272v1
- Date: Fri, 18 Jun 2021 17:59:06 GMT
- Title: Riemannian Convex Potential Maps
- Authors: Samuel Cohen, Brandon Amos, Yaron Lipman
- Abstract summary: We propose and study a class of flows that uses convex potentials from Riemannian optimal transport.
We demonstrate that these flows can model standard distributions on spheres and tori, using synthetic and geological data.
- Score: 28.39224890275125
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling distributions on Riemannian manifolds is a crucial component in
understanding non-Euclidean data that arises, e.g., in physics and geology. The
budding approaches in this space are limited by representational and
computational tradeoffs. We propose and study a class of flows that uses convex
potentials from Riemannian optimal transport. These are universal and can model
distributions on any compact Riemannian manifold without requiring domain
knowledge of the manifold to be integrated into the architecture. We
demonstrate that these flows can model standard distributions on spheres and
tori, using synthetic and geological data. Our source code is freely available
online at http://github.com/facebookresearch/rcpm
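The core construction behind convex potential maps from Riemannian optimal transport can be sketched in a few lines: an OT map for a (c-)convex potential phi takes the form T(x) = exp_x(-grad phi(x)), where grad is the Riemannian gradient and exp_x the exponential map. The following is a minimal numpy illustration on the unit sphere, not the paper's implementation; the toy linear potential is a hypothetical example chosen only to show that the map stays on the manifold.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x
    in tangent direction v for arc length ||v||."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (v / n)

def tangent_project(g, x):
    """Riemannian gradient on the sphere: project an ambient gradient g
    onto the tangent space at x."""
    return g - np.dot(g, x) * x

def convex_potential_map(x, ambient_grad_phi):
    """Push a point forward by T(x) = exp_x(-grad phi(x)), the form a
    Riemannian-OT map takes for a (c-)convex potential phi."""
    g = tangent_project(ambient_grad_phi(x), x)
    return sphere_exp(x, -g)

# Toy potential phi(x) = 0.3 * <x, mu> (hypothetical): constant ambient
# gradient 0.3 * mu, so the map nudges points away from mu.
mu = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 0.0, 0.0])
y = convex_potential_map(x, lambda p: 0.3 * mu)
```

Because the update is an exponential map of a tangent vector, the output remains exactly on the sphere; in the paper the potential phi is parameterized and learned rather than fixed.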
Related papers
- Generalised Flow Maps for Few-Step Generative Modelling on Riemannian Manifolds [32.40675406199536]
Generalised Flow Maps (GFMs) are a new class of few-step generative models. We benchmark GFMs against other geometric generative models on a suite of geometric datasets.
arXiv Detail & Related papers (2025-10-24T16:14:31Z) - Busemann Functions in the Wasserstein Space: Existence, Closed-Forms, and Applications to Slicing [13.473701044380938]
The Busemann function has recently found much interest in a variety of machine learning problems. We investigate the existence and computation of Busemann functions in Wasserstein space.
arXiv Detail & Related papers (2025-10-06T08:31:14Z) - Riemannian Consistency Model [57.933800575074535]
We propose the Riemannian Consistency Model (RCM), which, for the first time, enables few-step consistency modeling. We derive closed-form solutions for both discrete- and continuous-time training objectives for RCM. We also provide a unique kinematics perspective for interpreting the RCM objective, offering new theoretical angles.
arXiv Detail & Related papers (2025-10-01T14:57:25Z) - Riemannian generative decoder [11.074080383657453]
We present a new method for learning representations based on manifold-valued latents. Our method is compatible with existing architectures and yields interpretable latent spaces aligned with data geometry. We validate our approach on three case studies: a synthetic branching diffusion process, human migrations inferred from mitochondrial DNA, and cells undergoing a cell division cycle.
arXiv Detail & Related papers (2025-06-23T21:06:13Z) - Follow the Energy, Find the Path: Riemannian Metrics from Energy-Based Models [63.331590876872944]
We propose a method for deriving Riemannian metrics directly from pretrained Energy-Based Models. These metrics define spatially varying distances, enabling the computation of geodesics. We show that EBM-derived metrics consistently outperform established baselines.
arXiv Detail & Related papers (2025-05-23T12:18:08Z) - Riemannian Denoising Diffusion Probabilistic Models [7.964790563398277]
We propose RDDPMs for learning distributions on submanifolds of Euclidean space that are level sets of functions. We provide a theoretical analysis of our method in the continuous-time limit. The capability of our method is demonstrated on datasets from previous studies and on newly sampled datasets.
arXiv Detail & Related papers (2025-05-07T11:37:16Z) - Riemann$^2$: Learning Riemannian Submanifolds from Riemannian Data [12.424539896723603]
Latent variable models are powerful tools for learning low-dimensional manifolds from high-dimensional data.
This paper generalizes previous work and allows us to handle complex tasks in various domains, including robot motion synthesis and analysis of brain connectomes.
arXiv Detail & Related papers (2025-03-07T16:08:53Z) - Wasserstein Flow Matching: Generative modeling over families of distributions [13.620905707751747]
We show how to perform generative modeling over Gaussian distributions, generating representations of granular cell states from single-cell genomics data.
We also show that WFM can learn flows between high-dimensional and variable sized point-clouds and synthesize cellular microenvironments from spatial transcriptomics datasets.
arXiv Detail & Related papers (2024-11-01T15:55:07Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
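The drift described above, a weighted mean of tangent directions toward the data points, can be sketched schematically. This is a minimal numpy illustration on the unit sphere under assumed Gaussian-in-distance weights, which are a placeholder: the paper derives the exact mixture weights, and `mixture_bridge_drift` is a hypothetical name for illustration only.

```python
import numpy as np

def sphere_log(x, y):
    """Log map on the unit sphere: the tangent vector at x that points
    toward y, with length equal to the geodesic distance."""
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    return theta * (y - c * x) / np.sin(theta)

def mixture_bridge_drift(x, t, data, T=1.0):
    """Schematic drift of a mixture of geodesic bridge processes:
    a weighted mean of the tangent directions log_x(y_i) toward the
    data points, scaled by the remaining time T - t."""
    logs = np.stack([sphere_log(x, y) for y in data])
    d2 = (logs ** 2).sum(axis=1)          # squared geodesic distances
    w = np.exp(-d2 / (2.0 * (T - t)))     # placeholder bridge weights
    w = w / w.sum()
    return (w[:, None] * logs).sum(axis=0) / (T - t)

x = np.array([1.0, 0.0, 0.0])
data = [np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
v = mixture_bridge_drift(x, 0.5, data)
```

Since each log-map term lies in the tangent space at x, the weighted mean does too, so the resulting vector field is a valid drift on the manifold.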
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Manifold-augmented Eikonal Equations: Geodesic Distances and Flows on Differentiable Manifolds [5.0401589279256065]
We show how the geometry of a manifold impacts the distance field, and exploit the geodesic flow to obtain globally length-minimising curves directly.
This work opens opportunities for statistics and reduced-order modelling on differentiable manifolds.
arXiv Detail & Related papers (2023-10-09T21:11:13Z) - Riemannian Diffusion Schrödinger Bridge [56.20669989459281]
We introduce the Riemannian Diffusion Schrödinger Bridge to accelerate sampling of diffusion models.
We validate our proposed method on synthetic data and real Earth and climate data.
arXiv Detail & Related papers (2022-07-07T00:35:04Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) demonstrate remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience, or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - Implicit Riemannian Concave Potential Maps [2.8137865669570297]
This work combines ideas from implicit neural layers and optimal transport theory to propose a generalisation of existing work on exponential map flows.
IRCPMs have desirable properties, such as the ease of incorporating symmetries, and are less expensive than ODE flows.
We provide an initial theoretical analysis of their properties and lay out sufficient conditions for stable optimisation.
arXiv Detail & Related papers (2021-10-04T09:53:20Z) - Semi-Riemannian Graph Convolutional Networks [36.09315878397234]
We develop a principled Semi-Riemannian GCN that first models data in semi-Riemannian manifolds of constant nonzero curvature.
Our method provides a geometric inductive bias that is sufficiently flexible to model mixed heterogeneous topologies like hierarchical graphs with cycles.
arXiv Detail & Related papers (2021-06-06T14:23:34Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe for measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data, but the associated operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.