Busemann Functions in the Wasserstein Space: Existence, Closed-Forms, and Applications to Slicing
- URL: http://arxiv.org/abs/2510.04579v1
- Date: Mon, 06 Oct 2025 08:31:14 GMT
- Authors: Clément Bonet, Elsa Cazelles, Lucas Drumetz, Nicolas Courty
- Abstract summary: The Busemann function has recently found much interest in a variety of machine learning problems. We investigate the existence and computation of Busemann functions in the Wasserstein space.
- Score: 13.473701044380938
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Busemann function has recently found much interest in a variety of geometric machine learning problems, as it naturally defines projections onto geodesic rays of Riemannian manifolds and generalizes the notion of hyperplanes. As several sources of data can be conveniently modeled as probability distributions, it is natural to study this function in the Wasserstein space, which carries a rich formal Riemannian structure induced by Optimal Transport metrics. In this work, we investigate the existence and computation of Busemann functions in Wasserstein space, which admits geodesic rays. We establish closed-form expressions in two important cases: one-dimensional distributions and Gaussian measures. These results enable explicit projection schemes for probability distributions on $\mathbb{R}$, which in turn allow us to define novel Sliced-Wasserstein distances over Gaussian mixtures and labeled datasets. We demonstrate the efficiency of those original schemes on synthetic datasets as well as transfer learning problems.
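The objects in the abstract have well-known elementary forms that make the paper's setting concrete: the Busemann function of a unit-speed geodesic ray $\gamma$ is $B_\gamma(\mu) = \lim_{t\to\infty} \big(W_2(\mu, \gamma(t)) - t\big)$, the 1D Wasserstein distance reduces to an $L^2$ distance between quantile functions, and the distance between Gaussians has the Bures-Wasserstein closed form. A minimal NumPy sketch of these standard formulas (the function names `w2_1d`, `sliced_w2`, and `bures_w2_gaussians` are illustrative helpers, not the paper's API; the paper's specific Busemann closed forms are not reproduced here):

```python
import numpy as np

def w2_1d(x, y):
    """Squared 2-Wasserstein distance between two equal-size empirical
    measures on R: sort the samples (the empirical quantile functions)
    and take the mean squared difference."""
    xs, ys = np.sort(x), np.sort(y)
    return float(np.mean((xs - ys) ** 2))

def sliced_w2(X, Y, n_proj=100, seed=0):
    """Monte-Carlo squared Sliced-Wasserstein distance between two
    equal-size point clouds in R^d: average the 1D squared W2 over
    random projection directions on the unit sphere."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    return float(np.mean([w2_1d(X @ t, Y @ t) for t in theta]))

def _sqrtm_psd(A):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)
    return (V * np.sqrt(w)) @ V.T

def bures_w2_gaussians(m1, S1, m2, S2):
    """Closed-form squared W2 between Gaussians N(m1, S1) and N(m2, S2):
    |m1 - m2|^2 + Tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2})."""
    r1 = _sqrtm_psd(S1)
    cross = _sqrtm_psd(r1 @ S2 @ r1)
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))
```

These quantile and Gaussian representations are exactly the two regimes in which the paper reports closed-form Busemann functions.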
Related papers
- Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme [51.56484100374058]
Low-dimensional structure in real-world data plays an important role in the success of generative models. We establish convergence of numerical schemes for manifold-valued differential equations.
arXiv Detail & Related papers (2026-03-04T01:29:35Z) - Follow the Energy, Find the Path: Riemannian Metrics from Energy-Based Models [63.331590876872944]
We propose a method for deriving Riemannian metrics directly from pretrained Energy-Based Models. These metrics define spatially varying distances, enabling the computation of geodesics. We show that EBM-derived metrics consistently outperform established baselines.
arXiv Detail & Related papers (2025-05-23T12:18:08Z) - Riemannian Denoising Diffusion Probabilistic Models [7.964790563398277]
We propose RDDPMs for learning distributions on submanifolds of Euclidean space that are level sets of functions. We provide a theoretical analysis of our method in the continuous-time limit. The capability of our method is demonstrated on datasets from previous studies and on new sampled datasets.
arXiv Detail & Related papers (2025-05-07T11:37:16Z) - Riemann$^2$: Learning Riemannian Submanifolds from Riemannian Data [12.424539896723603]
Latent variable models are powerful tools for learning low-dimensional manifolds from high-dimensional data. This paper generalizes previous work and allows us to handle complex tasks in various domains, including robot motion synthesis and analysis of brain connectomes.
arXiv Detail & Related papers (2025-03-07T16:08:53Z) - Wasserstein Flow Matching: Generative modeling over families of distributions [13.620905707751747]
We propose Wasserstein flow matching (WFM), which lifts flow matching onto families of distributions using the Wasserstein geometry. Notably, WFM is the first algorithm capable of generating distributions in high dimensions, whether represented analytically (as Gaussians) or empirically (as point clouds).
arXiv Detail & Related papers (2024-11-01T15:55:07Z) - Polynomial Chaos Expansions on Principal Geodesic Grassmannian Submanifolds for Surrogate Modeling and Uncertainty Quantification [0.41709348827585524]
We introduce a manifold learning-based surrogate modeling framework for uncertainty quantification in high-dimensional systems.
We employ Principal Geodesic Analysis on the Grassmann manifold of the response to identify a set of disjoint principal geodesic submanifolds.
Polynomial chaos expansion is then used to construct a mapping between the random input parameters and the projection of the response.
arXiv Detail & Related papers (2024-01-30T02:13:02Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that, under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces I: the compact case [43.877478563933316]
Invariance to symmetries is one of the most fundamental forms of prior information one can consider.
In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces.
arXiv Detail & Related papers (2022-08-31T16:40:40Z) - Riemannian Convex Potential Maps [28.39224890275125]
We propose and study a class of flows that uses convex potentials from Riemannian optimal transport.
We demonstrate that these flows can model standard distributions on spheres and tori, on synthetic and geological data.
arXiv Detail & Related papers (2021-06-18T17:59:06Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Learning High Dimensional Wasserstein Geodesics [55.086626708837635]
We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions.
By applying the method of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we derive a minimax problem whose saddle point is the Wasserstein geodesic.
We then parametrize the functions by deep neural networks and design a sample based bidirectional learning algorithm for training.
arXiv Detail & Related papers (2021-02-05T04:25:28Z)
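In one dimension, the Wasserstein geodesic that this last paper learns with neural networks is available in closed form: McCann's displacement interpolation reduces to linear interpolation of the quantile functions. A small illustrative sketch for equal-size empirical measures (the helper `w2_geodesic_1d` is an assumed name, not the authors' high-dimensional method):

```python
import numpy as np

def w2_geodesic_1d(x, y, t):
    """A point on the W2 geodesic between two equal-size empirical
    measures on R at time t in [0, 1]: displacement interpolation,
    which in 1D is linear interpolation of the sorted samples
    (i.e. of the empirical quantile functions)."""
    return (1.0 - t) * np.sort(x) + t * np.sort(y)
```

For example, the geodesic midpoint between point masses at 0 and at 2 is a point mass at 1, matching the metric (rather than mixture) interpolation of optimal transport.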
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.