Identifying latent distances with Finslerian geometry
- URL: http://arxiv.org/abs/2212.10010v2
- Date: Wed, 11 Oct 2023 17:51:40 GMT
- Title: Identifying latent distances with Finslerian geometry
- Authors: Alison Pouplin, David Eklund, Carl Henrik Ek, Søren Hauberg
- Abstract summary: Generative models are often stochastic, making the data space, the pullback metric, and the geodesics stochastic as well, and thus at best impractical, and at worst impossible, to manipulate.
In this work, we propose another metric whose geodesics explicitly minimise the expected length of the pullback metric.
In high dimensions, we prove that both metrics converge to each other at a rate of $O\left(\frac{1}{D}\right)$.
- Score: 6.0188611984807245
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Riemannian geometry provides us with powerful tools to explore the latent
space of generative models while preserving the underlying structure of the
data. The latent space can be equipped with a Riemannian metric, pulled back
from the data manifold. With this metric, we can systematically navigate the
space relying on geodesics defined as the shortest curves between two points.
Generative models are often stochastic, causing the data space, the Riemannian
metric, and the geodesics, to be stochastic as well. Stochastic objects are at
best impractical, and at worst impossible, to manipulate. A common solution is
to approximate the stochastic pullback metric by its expectation. But the
geodesics derived from this expected Riemannian metric do not correspond to the
expected length-minimising curves. In this work, we propose another metric
whose geodesics explicitly minimise the expected length of the pullback metric.
We show this metric defines a Finsler metric, and we compare it with the
expected Riemannian metric. In high dimensions, we prove that both metrics
converge to each other at a rate of $O\left(\frac{1}{D}\right)$. This
convergence implies that the established expected Riemannian metric is an
accurate approximation of the theoretically more grounded Finsler metric. This
provides justification for using the expected Riemannian metric for practical
implementations.
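To make the comparison concrete: for a decoder with Gaussian noise, $f(z) = \mu(z) + \sigma(z) \odot \epsilon$, the expected Riemannian metric is $G(z) = \mathbb{E}[J^\top J] = J_\mu^\top J_\mu + J_\sigma^\top J_\sigma$, while the Finsler metric measures a tangent vector $v$ by $F(z, v) = \mathbb{E}[\|J v\|]$. By Jensen's inequality, $F(z, v) \le \sqrt{v^\top G(z)\, v}$, and the abstract's result says this gap closes at rate $O\left(\frac{1}{D}\right)$. The minimal Monte Carlo sketch below (not the authors' code; the Jacobians $J_\mu$, $J_\sigma$ are random stand-ins for a decoder's derivatives at a fixed latent point) illustrates that behaviour:

```python
# Toy sketch contrasting the expected Riemannian metric with the Finsler
# metric for a stochastic decoder. Assumption (not from the paper's code):
# a Gaussian decoder f(z) = mu(z) + sigma(z) * eps with diagonal noise,
# whose Jacobian J = J_mu + diag(eps) @ J_sigma is itself Gaussian.
import numpy as np

rng = np.random.default_rng(0)

def metrics_at(z_dim, D, n_mc=10_000):
    # Random stand-ins for the decoder mean/std Jacobians at a latent point z.
    J_mu = rng.normal(size=(D, z_dim)) / np.sqrt(D)
    J_sigma = rng.normal(size=(D, z_dim)) / np.sqrt(D)
    v = rng.normal(size=z_dim)                    # tangent vector at z

    # Expected Riemannian metric G = E[J^T J] = J_mu^T J_mu + J_sigma^T J_sigma,
    # giving the length element sqrt(v^T G v).
    G = J_mu.T @ J_mu + J_sigma.T @ J_sigma
    riemann = np.sqrt(v @ G @ v)

    # Finsler metric F(z, v) = E[ ||J v|| ], estimated by Monte Carlo over eps.
    eps = rng.normal(size=(n_mc, D))
    Jv = J_mu @ v + eps * (J_sigma @ v)           # one sample of J v per row
    finsler = np.linalg.norm(Jv, axis=1).mean()
    return riemann, finsler

for D in (2, 10, 100, 1000):
    r, f = metrics_at(z_dim=2, D=D)
    print(f"D={D:5d}  Riemannian={r:.4f}  Finsler={f:.4f}  gap={r - f:.2e}")
```

Up to Monte Carlo noise, the printed gap stays non-negative (Jensen) and shrinks roughly in proportion to $1/D$, consistent with the convergence rate stated above.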
Related papers
- On Probabilistic Pullback Metrics on Latent Hyperbolic Manifolds [5.724027955589408]
This paper focuses on the hyperbolic manifold, a particularly suitable choice for modeling hierarchical relationships.
We propose augmenting the hyperbolic metric with a pullback metric to account for distortions introduced by the LVM's nonlinear mapping.
Through various experiments, we demonstrate that geodesics on the pullback metric not only respect the geometry of the hyperbolic latent space but also align with the underlying data distribution.
arXiv Detail & Related papers (2024-10-28T09:13:00Z) - (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z) - Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z) - The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z) - Warped geometric information on the optimisation of Euclidean functions [43.43598316339732]
We consider optimisation of a real-valued function defined in a potentially high-dimensional Euclidean space.
We find the function's optimum along a manifold with a warped metric.
Our proposed algorithm, using a 3rd-order approximation of geodesics, tends to outperform standard Euclidean gradient-based counterparts.
arXiv Detail & Related papers (2023-08-16T12:08:50Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Short and Straight: Geodesics on Differentiable Manifolds [6.85316573653194]
In this work, we first analyse existing methods for computing length-minimising geodesics.
Second, we propose a model-based parameterisation for distance fields and geodesic flows on continuous manifolds.
Third, we develop a curvature-based training mechanism, sampling and scaling points in regions of the manifold exhibiting larger values of the Ricci scalar.
arXiv Detail & Related papers (2023-05-24T15:09:41Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Ultrahyperbolic Representation Learning [13.828165530602224]
In machine learning, data is usually represented in a (flat) Euclidean space where distances between points are measured along straight lines.
We propose a representation living on a pseudo-Riemannian manifold of constant nonzero curvature.
We provide the necessary learning tools in this geometry and extend gradient-based optimization techniques.
arXiv Detail & Related papers (2020-07-01T03:49:24Z)