Generalised Flow Maps for Few-Step Generative Modelling on Riemannian Manifolds
- URL: http://arxiv.org/abs/2510.21608v1
- Date: Fri, 24 Oct 2025 16:14:31 GMT
- Title: Generalised Flow Maps for Few-Step Generative Modelling on Riemannian Manifolds
- Authors: Oscar Davis, Michael S. Albergo, Nicholas M. Boffi, Michael M. Bronstein, Avishek Joey Bose,
- Abstract summary: Generalised Flow Maps (GFMs) are a new class of few-step generative models. We benchmark GFMs against other geometric generative models on a suite of geometric datasets.
- Score: 32.40675406199536
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Geometric data and purpose-built generative models on them have become ubiquitous in high-impact deep learning application domains, ranging from protein backbone generation and computational chemistry to geospatial data. Current geometric generative models remain computationally expensive at inference -- requiring many steps of complex numerical simulation -- as they are derived from dynamical measure transport frameworks such as diffusion and flow-matching on Riemannian manifolds. In this paper, we propose Generalised Flow Maps (GFM), a new class of few-step generative models that generalises the Flow Map framework in Euclidean spaces to arbitrary Riemannian manifolds. We instantiate GFMs with three self-distillation-based training methods: Generalised Lagrangian Flow Maps, Generalised Eulerian Flow Maps, and Generalised Progressive Flow Maps. We theoretically show that GFMs, under specific design decisions, unify and elevate existing Euclidean few-step generative models, such as consistency models, shortcut models, and meanflows, to the Riemannian setting. We benchmark GFMs against other geometric generative models on a suite of geometric datasets, including geospatial data, RNA torsion angles, and hyperbolic manifolds, and achieve state-of-the-art sample quality for single- and few-step evaluations, and superior or competitive log-likelihoods using the implicit probability flow.
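As an illustrative sketch (not the paper's implementation), a flow map on a manifold sends a state at time t directly to time s, so sampling needs only a few exponential-map steps instead of a long ODE simulation. The toy "flow map" below simply contracts geodesically toward a fixed point on the unit sphere; a real GFM would replace it with a trained network:

```python
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (v / n)

def log_map(x, y):
    """Logarithm map on the sphere: tangent vector at x pointing toward y."""
    d = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    p = y - np.dot(x, y) * x           # project y onto the tangent space at x
    n = np.linalg.norm(p)
    if n < 1e-12:
        return np.zeros_like(x)
    return d * (p / n)

def toy_flow_map(x_t, t, s):
    """Stand-in for a learned flow map X(t, s, x): a geodesic step toward a
    fixed 'data' point. A real GFM would use a trained network here."""
    target = np.array([0.0, 0.0, 1.0])
    return exp_map(x_t, (s - t) * log_map(x_t, target))

# Few-step sampling: jump from "noise" (t=0) toward "data" (t=1) in 2 steps.
x = np.array([1.0, 0.0, 0.0])          # initial sample on the sphere
for t, s in [(0.0, 0.5), (0.5, 1.0)]:
    x = toy_flow_map(x, t, s)
print(np.round(x, 3))                  # the iterate stays on the unit sphere
```

The point of the sketch is only that each step is an intrinsic exponential-map update, so samples never leave the manifold.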
Related papers
- Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme [51.56484100374058]
Low-dimensional structure in real-world data plays an important role in the success of generative models. We prove a convergence theory for numerical schemes for manifold-valued differential equations.
arXiv Detail & Related papers (2026-03-04T01:29:35Z)
- Carré du champ flow matching: better quality-generalisation tradeoff in generative models [24.078205139029546]
Carré du champ flow matching (CDC-FM) is a generalisation of flow matching (FM). We show that CDC-FM consistently offers a better quality-generalisation tradeoff. Our work provides a mathematical framework for studying the interplay between data geometry, generalisation and memorisation in generative models.
arXiv Detail & Related papers (2025-10-07T13:41:33Z) - Riemannian Consistency Model [57.933800575074535]
We propose the Riemannian Consistency Model (RCM), which, for the first time, enables few-step consistency modeling. We derive the closed-form solutions for both discrete- and continuous-time training objectives for RCM. We provide a unique kinematics perspective for interpreting the RCM objective, offering new theoretical angles.
arXiv Detail & Related papers (2025-10-01T14:57:25Z)
- Riemannian Denoising Diffusion Probabilistic Models [7.964790563398277]
We propose RDDPMs for learning distributions on submanifolds of Euclidean space that are level sets of functions. We provide a theoretical analysis of our method in the continuous-time limit. The capability of our method is demonstrated on datasets from previous studies and on newly sampled datasets.
arXiv Detail & Related papers (2025-05-07T11:37:16Z)
- Riemannian Variational Flow Matching for Material and Protein Design [37.328940532069424]
In Euclidean space, predicting endpoints (VFM), velocities (FM), or noise (diffusion) is largely equivalent due to the affine structure of the space. On curved manifolds, this equivalence breaks down, and we hypothesize that endpoint prediction provides a stronger learning signal. Building on this insight, we derive a variational flow matching objective. Experiments on synthetic spherical and hyperbolic benchmarks, as well as real-world tasks in material and protein generation, demonstrate that RG-VFM more effectively captures manifold structure.
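The relation between endpoint and velocity prediction can be made concrete with the geodesic interpolant on the unit sphere: given the conditional velocity at an intermediate point, the endpoint is recovered through the exponential map. A minimal numerical sketch (illustrative only, not RG-VFM's training code):

```python
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere."""
    n = np.linalg.norm(v)
    return x if n < 1e-12 else np.cos(n) * x + np.sin(n) * (v / n)

def log_map(x, y):
    """Logarithm map on the unit sphere."""
    d = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    p = y - np.dot(x, y) * x
    n = np.linalg.norm(p)
    return np.zeros_like(x) if n < 1e-12 else d * (p / n)

x0 = np.array([1.0, 0.0, 0.0])                 # source point
x1 = np.array([0.0, 1.0, 0.0])                 # target (endpoint)
t = 0.3
x_t = exp_map(x0, t * log_map(x0, x1))         # geodesic interpolant at time t
v_t = log_map(x_t, x1) / (1.0 - t)             # conditional velocity (FM target)
x1_rec = exp_map(x_t, (1.0 - t) * v_t)         # endpoint recovered from velocity
print(np.round(x1_rec, 6))                     # matches x1
```

The recovery uses only the log and exp maps, i.e. exactly the local geometric operations that differ between flat and curved spaces.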
arXiv Detail & Related papers (2025-02-18T16:02:10Z)
- Sigma Flows for Image and Data Labeling and Learning Structured Prediction [5.875121114945721]
This paper introduces the sigma flow model for the prediction of structured labelings of data observed on a Riemannian manifold. The approach combines the Laplace-Beltrami framework for image denoising and enhancement, introduced by Sochen, Kimmel and Malladi about 25 years ago, with the assignment flow approach introduced and studied by the authors.
arXiv Detail & Related papers (2024-08-28T17:04:56Z)
- Categorical Flow Matching on Statistical Manifolds [12.646272756981672]
We introduce a flow-matching framework on the manifold of parameterized probability measures, inspired by information geometry. We develop an efficient training and sampling algorithm that overcomes numerical instability using a diffeomorphism between manifolds. We show that SFM can learn more complex patterns on the statistical manifold where existing models often fail due to strong prior assumptions.
arXiv Detail & Related papers (2024-05-26T05:50:39Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
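That geometric picture can be sketched on the unit sphere: the drift is a weighted average of log-map directions from the current state to the data points. The softmax weights below are purely illustrative stand-ins for the paper's bridge-process weights:

```python
import numpy as np

def log_map(x, y):
    """Logarithm map on the unit sphere: tangent vector at x pointing toward y."""
    d = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    p = y - np.dot(x, y) * x
    n = np.linalg.norm(p)
    return np.zeros_like(x) if n < 1e-12 else d * (p / n)

data = np.array([[0.0, 0.0, 1.0],
                 [0.0, 1.0, 0.0]])             # two "data" points on the sphere
x = np.array([1.0, 0.0, 0.0])                  # current state of the process

dists = np.array([np.arccos(np.clip(np.dot(x, y), -1.0, 1.0)) for y in data])
w = np.exp(-dists)
w /= w.sum()                                   # illustrative weights, not the paper's exact ones
drift = sum(wi * log_map(x, y) for wi, y in zip(w, data))

# By construction the drift lies in the tangent space at x (orthogonal to x).
print(np.round(np.dot(drift, x), 9))
```

The check at the end confirms the drift is a genuine tangent vector, which is what makes the resulting process stay on the manifold.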
arXiv Detail & Related papers (2023-10-11T06:04:40Z)
- Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) have demonstrated remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience, or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
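Moser Flow's key identity is that the model density is the prior density minus the divergence of a learned vector field, mu = nu - div u, so training needs no ODE solve. A finite-difference illustration on the unit square (the vector field here is arbitrary, not a trained one):

```python
import numpy as np

# Moser Flow: model density mu(x) = nu(x) - div(u)(x). If u vanishes on the
# boundary, mu integrates to 1 automatically (divergence theorem), which is
# why no ODE solver is invoked during training. Illustrative check on [0,1]^2.
n = 201
xs = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(xs, xs, indexing="ij")

nu = np.ones((n, n))                           # uniform prior density on [0,1]^2
# An arbitrary vector field that vanishes on the boundary of the square.
u1 = np.sin(np.pi * X) * np.sin(np.pi * Y)
u2 = 0.5 * np.sin(np.pi * X) * np.sin(2 * np.pi * Y)

div_u = np.gradient(u1, xs, axis=0) + np.gradient(u2, xs, axis=1)
mu = nu - div_u                                # model density

total = mu.mean()                              # uniform-grid estimate of the integral
print(round(float(total), 3))                  # close to 1, as the identity predicts
```

The estimate is only as accurate as the finite-difference divergence, but it shows why normalization comes for free in this parameterization.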
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
- Neural Manifold Ordinary Differential Equations [46.25832801867149]
We introduce Neural Manifold Ordinary Differential Equations, which enable the construction of Manifold Continuous Normalizing Flows (MCNFs).
MCNFs require only local geometry and compute probabilities with continuous change of variables.
We find that leveraging continuous manifold dynamics produces a marked improvement for both density estimation and downstream tasks.
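The continuous change of variables behind (M)CNF likelihoods is d/dt log p(x(t)) = -div v(x(t)). A one-dimensional Euclidean sketch of accumulating this log-density correction along an Euler-integrated trajectory (illustrative only; MCNFs generalize this using local manifold geometry):

```python
import numpy as np

# Vector field v(x) = a * x, so div v = a everywhere.
a = 0.7
x0, T, steps = 0.5, 1.0, 10_000
dt = T / steps

x, logdet = x0, 0.0
for _ in range(steps):
    x += dt * a * x                  # Euler step of the flow
    logdet += dt * a                 # accumulate div v = -d/dt log p

# Pushforward log-density at x(T) under a standard normal base density:
log_p = -0.5 * x0**2 - 0.5 * np.log(2 * np.pi) - logdet
# Analytic check: x(T) = x0 * e^{aT} and log|dx_T/dx_0| = a*T.
print(round(logdet, 4))              # accumulates to a*T
```

Here the divergence is constant, so the accumulated term is exact; for a learned field the same integral is estimated along the trajectory.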
arXiv Detail & Related papers (2020-06-18T03:24:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.