Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme
- URL: http://arxiv.org/abs/2603.03626v1
- Date: Wed, 04 Mar 2026 01:29:35 GMT
- Title: Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme
- Authors: Zhiyuan Zhan, Masashi Sugiyama
- Abstract summary: Low-dimensional structure in real-world data plays an important role in the success of generative models. We prove convergence theory of numerical schemes for manifold-valued differential equations.
- Score: 51.56484100374058
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low-dimensional structure in real-world data plays an important role in the success of generative models, which motivates diffusion models defined on intrinsic data manifolds. Such models are driven by stochastic differential equations (SDEs) on manifolds, which raises the need for convergence theory of numerical schemes for manifold-valued SDEs. In Euclidean space, the Euler--Maruyama (EM) scheme achieves strong convergence with order $1/2$, but an analogous result for manifold discretizations is less understood in general settings. In this work, we study a geometric version of the EM scheme for SDEs on Riemannian manifolds and prove strong convergence with order $1/2$ under geometric and regularity conditions. As an application, we obtain a Wasserstein bound for sampling on manifolds via the geometric EM discretization of Riemannian Langevin dynamics.
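The geometric Euler--Maruyama idea in the abstract — take the Euclidean EM increment in the tangent space, then map it back onto the manifold — can be illustrated on the unit sphere, where the exponential map has a closed form. This is a minimal sketch under our own assumptions (a von-Mises-Fisher-style target and self-chosen function names), not the paper's construction or notation:

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x
    in tangent direction v for unit time."""
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * v / norm

def tangent_project(x, v):
    """Orthogonal projection of v onto the tangent space at x."""
    return v - np.dot(x, v) * x

def geometric_em_langevin(grad_U, x0, step, n_steps, rng):
    """Simulate Riemannian Langevin dynamics
        dX_t = -grad U(X_t) dt + sqrt(2) dB_t
    on the sphere with a geometric Euler--Maruyama scheme:
    EM increment in the tangent space, then exponential map back."""
    x = x0 / np.linalg.norm(x0)
    d = x.shape[0]
    for _ in range(n_steps):
        noise = rng.standard_normal(d)
        # Projecting the Euclidean gradient gives the Riemannian gradient.
        v = tangent_project(x, -grad_U(x) * step
                               + np.sqrt(2.0 * step) * noise)
        x = sphere_exp(x, v)
    return x

# Illustrative target: density proportional to exp(kappa * <mu, x>),
# i.e. U(x) = -kappa * <mu, x> with Euclidean gradient -kappa * mu.
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])
grad_U = lambda x: -4.0 * mu
x = geometric_em_langevin(grad_U, np.array([1.0, 0.0, 0.0]),
                          step=0.01, n_steps=2000, rng=rng)
```

Because every step ends with an exponential map, the iterate stays exactly on the manifold (up to floating-point drift), which is the structural advantage of the geometric scheme over naive Euclidean EM followed by projection.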
Related papers
- Riemannian Consistency Model [57.933800575074535]
We propose the Riemannian Consistency Model (RCM), which, for the first time, enables few-step consistency modeling. We derive the closed-form solutions for both discrete- and continuous-time training objectives for RCM. We provide a unique kinematics perspective for interpreting the RCM objective, offering new theoretical angles.
arXiv Detail & Related papers (2025-10-01T14:57:25Z) - Follow the Energy, Find the Path: Riemannian Metrics from Energy-Based Models [63.331590876872944]
We propose a method for deriving Riemannian metrics directly from pretrained Energy-Based Models. These metrics define spatially varying distances, enabling the computation of geodesics. We show that EBM-derived metrics consistently outperform established baselines.
arXiv Detail & Related papers (2025-05-23T12:18:08Z) - Riemannian Denoising Diffusion Probabilistic Models [7.964790563398277]
We propose RDDPMs for learning distributions on submanifolds of Euclidean space that are level sets of functions. We provide a theoretical analysis of our method in the continuous-time limit. The capability of our method is demonstrated on datasets from previous studies and on new sampled datasets.
arXiv Detail & Related papers (2025-05-07T11:37:16Z) - Linear Convergence of Diffusion Models Under the Manifold Hypothesis [5.040884755454258]
We show that the number of steps to converge in Kullback--Leibler (KL) divergence is linear (up to logarithmic terms) in the intrinsic dimension $d$. We also show that this linear dependency is sharp.
arXiv Detail & Related papers (2024-10-11T17:58:30Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) have demonstrated remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - Nested Grassmannians for Dimensionality Reduction with Applications [7.106986689736826]
We propose a novel framework for constructing a nested sequence of homogeneous Riemannian manifolds.
We focus on applying the proposed framework to the Grassmann manifold, giving rise to the nested Grassmannians (NG).
Specifically, each planar (2D) shape can be represented as a point in the complex projective space, which is a complex Grassmann manifold.
With the proposed NG structure, we develop algorithms for the supervised and unsupervised dimensionality reduction problems respectively.
arXiv Detail & Related papers (2020-10-27T20:09:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.