Riemannian Consistency Model
- URL: http://arxiv.org/abs/2510.00983v2
- Date: Mon, 03 Nov 2025 05:11:44 GMT
- Title: Riemannian Consistency Model
- Authors: Chaoran Cheng, Yusong Wang, Yuxin Chen, Xiangxin Zhou, Nanning Zheng, Ge Liu
- Abstract summary: We propose the Riemannian Consistency Model (RCM), which, for the first time, enables few-step consistency modeling. We derive the closed-form solutions for both discrete- and continuous-time training objectives for RCM. We provide a unique kinematics perspective for interpreting the RCM objective, offering new theoretical angles.
- Score: 57.933800575074535
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Consistency models are a class of generative models that enable few-step generation for diffusion and flow matching models. While consistency models have achieved promising results on Euclidean domains like images, their applications to Riemannian manifolds remain challenging due to the curved geometry. In this work, we propose the Riemannian Consistency Model (RCM), which, for the first time, enables few-step consistency modeling while respecting the intrinsic manifold constraint imposed by the Riemannian geometry. Leveraging the covariant derivative and exponential-map-based parameterization, we derive the closed-form solutions for both discrete- and continuous-time training objectives for RCM. We then demonstrate theoretical equivalence between the two variants of RCM: Riemannian consistency distillation (RCD), which relies on a teacher model to approximate the marginal vector field, and Riemannian consistency training (RCT), which utilizes the conditional vector field for training. We further propose a simplified training objective that eliminates the need for the complicated differential calculation. Finally, we provide a unique kinematics perspective for interpreting the RCM objective, offering new theoretical angles. Through extensive experiments, we demonstrate the superior generative quality of RCM in few-step generation on various non-Euclidean manifolds, including flat tori, spheres, and the 3D rotation group SO(3).
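The abstract's exponential-map-based parameterization refers to updating points along tangent directions so that iterates never leave the manifold. As a minimal illustration (not the paper's actual parameterization), the sketch below implements the exponential map on the unit sphere in NumPy; `sphere_exp` and the sample points are assumptions introduced here for the example:

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: move from point x along
    tangent vector v (with v orthogonal to x), staying on the sphere."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x  # zero tangent vector: stay at x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

# A hypothetical one-step update: a consistency model would predict a
# tangent vector at the current point, and the exponential map would
# map it back onto the manifold.
x = np.array([0.0, 0.0, 1.0])        # north pole
v = np.array([np.pi / 2, 0.0, 0.0])  # tangent at x (v @ x == 0)
y = sphere_exp(x, v)                 # result still has unit norm
```

A Euclidean update `x + v` would leave the sphere; composing the model's tangent-space prediction with the exponential map is what enforces the intrinsic manifold constraint the abstract describes.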
Related papers
- Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme [51.56484100374058]
Low-dimensional structure in real-world data plays an important role in the success of generative models. We prove convergence theory of numerical schemes for manifold-valued differential equations.
arXiv Detail & Related papers (2026-03-04T01:29:35Z) - Efficient Diffusion Models for Symmetric Manifolds [25.99200001269046]
We introduce a framework for designing efficient diffusion models on $d$-dimensional symmetric manifolds. Manifold symmetries ensure the diffusion satisfies an "average-case" Lipschitz condition. Our model outperforms prior methods in training speed and improves sample quality on synthetic datasets.
arXiv Detail & Related papers (2025-05-27T18:12:29Z) - Riemannian Denoising Diffusion Probabilistic Models [7.964790563398277]
We propose RDDPMs for learning distributions on submanifolds of Euclidean space that are level sets of functions. We provide a theoretical analysis of our method in the continuous-time limit. The capability of our method is demonstrated on datasets from previous studies and on new sampled datasets.
arXiv Detail & Related papers (2025-05-07T11:37:16Z) - Riemannian Neural Geodesic Interpolant [15.653104625330062]
Differential interpolants are efficient generative models that bridge two arbitrary probability density functions in finite time. These models are primarily developed in Euclidean space, and are therefore limited in their application to many distribution learning problems. We introduce the Riemannian Neural Geodesic Interpolant (RNGI) model, which interpolates between two probability densities.
arXiv Detail & Related papers (2025-04-22T09:28:29Z) - Geometric Trajectory Diffusion Models [58.853975433383326]
Generative models have shown great promise in generating 3D geometric systems.
Existing approaches only operate on static structures, neglecting the fact that physical systems are always dynamic in nature.
We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
arXiv Detail & Related papers (2024-10-16T20:36:41Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifold.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.