Sliced-Wasserstein Distances and Flows on Cartan-Hadamard Manifolds
- URL: http://arxiv.org/abs/2403.06560v1
- Date: Mon, 11 Mar 2024 10:01:21 GMT
- Title: Sliced-Wasserstein Distances and Flows on Cartan-Hadamard Manifolds
- Authors: Clément Bonet and Lucas Drumetz and Nicolas Courty
- Abstract summary: We derive general constructions of Sliced-Wasserstein distances on Cartan-Hadamard manifolds.
We also propose non-parametric schemes to minimize these new distances by approximating their Wasserstein gradient flows.
- Score: 13.851780805245477
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While many Machine Learning methods were developed or transposed to
Riemannian manifolds to tackle data with known non-Euclidean geometry, Optimal
Transport (OT) methods on such spaces have not received much attention. The
main OT tool on these spaces is the Wasserstein distance which suffers from a
heavy computational burden. On Euclidean spaces, a popular alternative is the
Sliced-Wasserstein distance, which leverages a closed-form solution of the
Wasserstein distance in one dimension, but which is not readily available on
manifolds. In this work, we derive general constructions of Sliced-Wasserstein
distances on Cartan-Hadamard manifolds, Riemannian manifolds with non-positive
curvature, which include among others Hyperbolic spaces or the space of
Symmetric Positive Definite matrices. Then, we propose different applications.
Additionally, we derive non-parametric schemes to minimize these new distances
by approximating their Wasserstein gradient flows.
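As context for the Euclidean construction the paper generalizes: the Sliced-Wasserstein distance is typically estimated by Monte Carlo, projecting both samples onto random directions and averaging the closed-form one-dimensional Wasserstein distances (which reduce to matching sorted samples). A minimal NumPy sketch of this standard Euclidean estimator (illustrative only, not the paper's manifold construction):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=200, p=2, seed=0):
    """Monte Carlo estimate of SW_p^p between two equal-size samples in R^d."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Draw random directions uniformly on the unit sphere S^{d-1}
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both point clouds onto each direction
    X_proj = X @ theta.T  # shape (n, n_projections)
    Y_proj = Y @ theta.T
    # 1D Wasserstein between empirical measures: match sorted samples
    X_sorted = np.sort(X_proj, axis=0)
    Y_sorted = np.sort(Y_proj, axis=0)
    return np.mean(np.abs(X_sorted - Y_sorted) ** p)
```

Identical samples give a distance of zero, and the estimator only requires sorting, which is what makes SW cheap compared to solving a full OT problem.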
Related papers
- Continuous-time Riemannian SGD and SVRG Flows on Wasserstein Probabilistic Space [17.13355049019388]
We extend the gradient flow on Wasserstein space to stochastic gradient descent (SGD) and stochastic variance-reduced gradient (SVRG) flows.
By leveraging the properties of the Wasserstein space, we construct differential equations that approximate the corresponding discrete dynamics in Euclidean space.
We prove convergence results that match their Euclidean-space counterparts.
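The gradient-flow minimization idea recurring in these papers can be pictured, in the plain Euclidean sliced setting, as an explicit particle scheme: each step pulls the sorted projections of the source particles toward the matching sorted projections of the target. A hedged sketch (a toy Euler discretization, not any paper's actual scheme; scaling constants are absorbed into the step size):

```python
import numpy as np

def sw_gradient_flow_step(X, Y, step=0.2, n_projections=100, rng=None):
    """One explicit Euler step of a particle scheme decreasing SW_2^2(X, Y).

    For each random direction theta, particles of X (sorted along theta)
    are pulled toward the matching sorted projections of Y along theta.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    grad = np.zeros_like(X)
    for t in theta:
        x_proj = X @ t
        y_sorted = np.sort(Y @ t)
        order_x = np.argsort(x_proj)
        # Residual of the 1D optimal (monotone) matching, per particle
        residual = np.empty(n)
        residual[order_x] = x_proj[order_x] - y_sorted
        grad += residual[:, None] * t[None, :]
    grad *= 2.0 / n_projections  # average over directions
    return X - step * grad
```

Iterating this step moves the empirical measure of `X` toward that of `Y`; the paper's contribution is making such schemes well-defined on Cartan-Hadamard manifolds rather than in this flat setting.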
arXiv Detail & Related papers (2024-01-24T15:35:44Z) - Leveraging Optimal Transport via Projections on Subspaces for Machine
Learning Applications [0.0]
In this thesis, we focus on alternatives that use projections on subspaces.
The main such alternative is the Sliced-Wasserstein distance.
Returning to the original Euclidean Sliced-Wasserstein distance between probability measures, we study the dynamics of its gradient flows.
arXiv Detail & Related papers (2023-11-23T10:13:07Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Hyperbolic Sliced-Wasserstein via Geodesic and Horospherical Projections [17.48229977212902]
Embedding data that present an underlying hierarchical structure in hyperbolic spaces has been shown to be beneficial for many types of data.
Many machine learning tools have been extended to such spaces, but only a few discrepancies exist to compare probability distributions defined over them.
In this work, we propose to derive novel hyperbolic sliced-Wasserstein discrepancies.
arXiv Detail & Related papers (2022-11-18T07:44:27Z) - Spherical Sliced-Wasserstein [14.98994743486746]
The Sliced-Wasserstein distance (SW) is restricted to data living in Euclidean spaces.
We focus more specifically on the sphere, for which we define a novel SW discrepancy, which we call spherical Sliced-Wasserstein.
Our construction is notably based on closed-form solutions of the Wasserstein distance on the circle, together with a new spherical Radon transform.
arXiv Detail & Related papers (2022-06-17T13:48:50Z) - Neural Bregman Divergences for Distance Learning [60.375385370556145]
We propose a new approach to learning arbitrary Bregman divergences in a differentiable manner via input convex neural networks.
We show that our method more faithfully learns divergences over a set of both new and previously studied tasks.
Our tests further extend to known asymmetric, but non-Bregman tasks, where our method still performs competitively despite misspecification.
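For context on the object this paper learns: a Bregman divergence is generated by a strictly convex function $\varphi$ via $D_\varphi(x, y) = \varphi(x) - \varphi(y) - \langle \nabla\varphi(y), x - y \rangle$, and the input-convex network parameterizes $\varphi$. A hedged NumPy illustration with a fixed, hand-chosen $\varphi$ (not the paper's learned model):

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad_phi(y), x - y>.

    Nonnegative when phi is convex; asymmetric in x and y in general.
    """
    return phi(x) - phi(y) - grad_phi(y) @ (x - y)

# phi(v) = ||v||^2 recovers the squared Euclidean distance as a special case.
sq_norm = lambda v: float(v @ v)
sq_norm_grad = lambda v: 2.0 * v
```

Replacing the closed-form `phi`/`grad_phi` pair with an input-convex neural network and its autodiff gradient is, roughly, the differentiable-learning setup the paper describes.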
arXiv Detail & Related papers (2022-06-09T20:53:15Z) - A Graph-based approach to derive the geodesic distance on Statistical
manifolds: Application to Multimedia Information Retrieval [5.1388648724853825]
We leverage the properties of non-Euclidean geometry to define the geodesic distance.
We propose an approximation of the geodesic distance through a graph-based method.
Our main aim is to compare this graph-based approximation to state-of-the-art approximations.
arXiv Detail & Related papers (2021-06-26T16:39:54Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method based on a novel, incremental tangent-space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z) - On Projection Robust Optimal Transport: Sample Complexity and Model
Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
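The averaging-versus-optimizing distinction can be made concrete: an IPRW-style quantity averages the Wasserstein cost of projections onto random $k$-dimensional subspaces, whereas PRW maximizes over subspaces. A hedged sketch of the averaging variant for equal-size empirical measures, using an exact assignment solve per subspace (illustrative, not the paper's estimator):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iprw_estimate(X, Y, k=2, n_subspaces=20, seed=0):
    """Average W_2^2 between projections onto random k-dim subspaces."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_subspaces):
        # Random k-dimensional orthonormal frame via QR decomposition
        Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
        Xp, Yp = X @ Q, Y @ Q
        # Exact W_2^2 between equal-size empirical measures: an assignment problem
        cost = ((Xp[:, None, :] - Yp[None, :, :]) ** 2).sum(-1)
        rows, cols = linear_sum_assignment(cost)
        total += cost[rows, cols].mean()
    return total / n_subspaces
```

With $k = 1$ the assignment reduces to sorting and this recovers the usual sliced estimator; replacing the average over subspaces with a maximum would give a PRW-style quantity instead.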
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - A diffusion approach to Stein's method on Riemannian manifolds [65.36007959755302]
We exploit the relationship between the generator of a diffusion on $\mathbf{M}$ with target invariant measure and its characterising Stein operator.
We derive Stein factors, which bound the solution to the Stein equation and its derivatives.
Our results imply that the bounds for $\mathbb{R}^m$ remain valid when $\mathbf{M}$ is a flat manifold.
arXiv Detail & Related papers (2020-03-25T17:03:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.