Riemannian Metric Learning: Closer to You than You Imagine
- URL: http://arxiv.org/abs/2503.05321v1
- Date: Fri, 07 Mar 2025 11:00:29 GMT
- Title: Riemannian Metric Learning: Closer to You than You Imagine
- Authors: Samuel Gruffaz, Josua Sassen, et al.
- Abstract summary: This review provides a structured and accessible overview of key methods, applications, and recent advances. It describes a powerful generalization that leverages differential geometry to model data according to their underlying Riemannian manifold, and argues that the review should serve as a valuable resource for researchers and practitioners.
- Score: 1.6574413179773761
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Riemannian metric learning is an emerging field in machine learning, unlocking new ways to encode complex data structures beyond traditional distance metric learning. While classical approaches rely on global distances in Euclidean space, they often fall short in capturing intrinsic data geometry. Enter Riemannian metric learning: a powerful generalization that leverages differential geometry to model the data according to their underlying Riemannian manifold. This approach has demonstrated remarkable success across diverse domains, from causal inference and optimal transport to generative modeling and representation learning. In this review, we bridge the gap between classical metric learning and Riemannian geometry, providing a structured and accessible overview of key methods, applications, and recent advances. We argue that Riemannian metric learning is not merely a technical refinement but a fundamental shift in how we think about data representations. Thus, this review should serve as a valuable resource for researchers and practitioners interested in exploring Riemannian metric learning and convince them that it is closer to them than they might imagine, both in theory and in practice.
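To make the contrast in the abstract concrete, here is a minimal, hypothetical sketch (not taken from the paper): a classical Mahalanobis-style distance uses one global metric matrix, while a Riemannian metric lets the metric tensor `G(x)` vary with position, so distances become lengths of paths. The toy metric tensor below is purely illustrative, not a learned one.

```python
import numpy as np

def mahalanobis_distance(x, y, M):
    """Classical metric learning: a single global metric M for the whole space."""
    d = x - y
    return np.sqrt(d @ M @ d)

def metric_tensor(x):
    """Toy position-dependent Riemannian metric: inflates lengths away from
    the origin (illustrative only; in metric learning, G would be fitted to data)."""
    return np.eye(2) * (1.0 + x @ x)

def riemannian_path_length(path):
    """Discretized curve length: sum of sqrt(v^T G(x) v) over path segments."""
    length = 0.0
    for a, b in zip(path[:-1], path[1:]):
        v = b - a
        length += np.sqrt(v @ metric_tensor(a) @ v)
    return length

x, y = np.array([0.0, 0.0]), np.array([1.0, 1.0])
straight = np.linspace(x, y, 100)  # finely discretized straight-line path

print(mahalanobis_distance(x, y, np.eye(2)))  # Euclidean case: sqrt(2)
print(riemannian_path_length(straight))       # larger, since G inflates lengths
```

Under a learned Riemannian metric, the distance between two points is the length of the shortest such path (a geodesic), which is exactly where the intrinsic geometry of the data enters.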
Related papers
- Riemann$^2$: Learning Riemannian Submanifolds from Riemannian Data [12.424539896723603]
Latent variable models are powerful tools for learning low-dimensional manifolds from high-dimensional data.
This paper generalizes previous work and allows us to handle complex tasks in various domains, including robot motion synthesis and analysis of brain connectomes.
arXiv Detail & Related papers (2025-03-07T16:08:53Z)
- Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z)
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices we propose Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- Isometric Immersion Learning with Riemannian Geometry [4.987314374901577]
There is still no manifold learning method that provides a theoretical guarantee of isometry.
Inspired by Nash's isometric theorem, we introduce a new concept called isometric immersion learning.
An unsupervised neural network-based model that simultaneously achieves metric and manifold learning is proposed.
arXiv Detail & Related papers (2024-09-23T07:17:06Z)
- (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z)
- A singular Riemannian Geometry Approach to Deep Neural Networks III. Piecewise Differentiable Layers and Random Walks on $n$-dimensional Classes [49.32130498861987]
We study the case of non-differentiable activation functions, such as ReLU.
Two recent works introduced a geometric framework to study neural networks.
We illustrate our findings with some numerical experiments on classification of images and thermodynamic problems.
arXiv Detail & Related papers (2024-04-09T08:11:46Z)
- Unraveling the Single Tangent Space Fallacy: An Analysis and Clarification for Applying Riemannian Geometry in Robot Learning [6.253089330116833]
Handling geometric constraints effectively requires the incorporation of tools from differential geometry into the formulation of machine learning methods.
Recent adoption in robot learning has been largely characterized by a mathematically flawed simplification.
This paper provides a theoretical elucidation of various misconceptions surrounding this approach and offers experimental evidence of its shortcomings.
arXiv Detail & Related papers (2023-10-11T21:16:01Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Neural Bregman Divergences for Distance Learning [60.375385370556145]
We propose a new approach to learning arbitrary Bregman divergences in a differentiable manner via input convex neural networks.
We show that our method more faithfully learns divergences over a set of both new and previously studied tasks.
Our tests further extend to known asymmetric, but non-Bregman tasks, where our method still performs competitively despite misspecification.
arXiv Detail & Related papers (2022-06-09T20:53:15Z)
- A prior-based approximate latent Riemannian metric [3.716965622352967]
We propose a surrogate conformal generative metric in the latent space of a generative model that is simple, efficient and robust.
We theoretically analyze the behavior of the proposed metric and show that it is sensible to use in practice.
Also, we show the applicability of the proposed methodology for data analysis in the life sciences.
arXiv Detail & Related papers (2021-03-09T08:31:52Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian data manifolds provide a principled way to model the nonlinear geometric structure inherent in data.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.