Principal subbundles for dimension reduction
- URL: http://arxiv.org/abs/2307.03128v1
- Date: Thu, 6 Jul 2023 16:55:21 GMT
- Title: Principal subbundles for dimension reduction
- Authors: Morten Akhøj, James Benn, Erlend Grong, Stefan Sommer, Xavier Pennec
- Abstract summary: We show how sub-Riemannian geometry can be used for manifold learning and surface reconstruction.
We show that the framework is robust when applied to noisy data.
- Score: 0.07515511160657122
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we demonstrate how sub-Riemannian geometry can be used for
manifold learning and surface reconstruction by combining local linear
approximations of a point cloud to obtain lower dimensional bundles. Local
approximations obtained by local PCAs are collected into a rank $k$ tangent
subbundle on $\mathbb{R}^d$, $k<d$, which we call a principal subbundle. This
determines a sub-Riemannian metric on $\mathbb{R}^d$. We show that
sub-Riemannian geodesics with respect to this metric can successfully be
applied to a number of important problems, such as: explicit construction of an
approximating submanifold $M$, construction of a representation of the
point-cloud in $\mathbb{R}^k$, and computation of distances between
observations, taking the learned geometry into account. The reconstruction is
guaranteed to equal the true submanifold in the limit case where tangent spaces
are estimated exactly. Via simulations, we show that the framework is robust
when applied to noisy data. Furthermore, the framework generalizes to
observations on an a priori known Riemannian manifold.
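The local-PCA step described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: at each point of a cloud in $\mathbb{R}^d$, PCA on a small neighborhood yields the top $k$ principal directions as an estimate of the tangent space, and collecting these frames approximates a rank $k$ principal subbundle. The function name, neighborhood size, and toy dataset are all illustrative choices.

```python
import numpy as np

def local_tangent_frames(points, k, n_neighbors=10):
    """Estimate a rank-k tangent frame at each point via local PCA.

    Returns an (n, d, k) array: for each of the n points, an orthonormal
    basis (columns) of the estimated k-dimensional tangent space.
    """
    n, d = points.shape
    frames = np.empty((n, d, k))
    for i, p in enumerate(points):
        # Nearest neighbors of p (including p itself)
        dists = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(dists)[:n_neighbors]]
        centered = nbrs - nbrs.mean(axis=0)
        # Rows of vt are principal directions; keep the top k as columns
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        frames[i] = vt[:k].T
    return frames

# Toy example: a noisy circle (a k=1 submanifold of R^2)
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
pts = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(200, 2))

frames = local_tangent_frames(pts, k=1)  # frames has shape (200, 2, 1)

# Sanity check: each estimated tangent direction should be nearly
# orthogonal to the radial direction of the circle.
radial = pts / np.linalg.norm(pts, axis=1, keepdims=True)
dots = np.abs(np.einsum('ij,ij->i', radial, frames[:, :, 0]))
print(frames.shape, float(dots.max()))
```

In the paper's framework these frames define a (degenerate) cometric on $\mathbb{R}^d$; the sketch above covers only the tangent-estimation step that feeds into it.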
Related papers
- Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z)
- Reconstructing the Geometry of Random Geometric Graphs [9.004991291124096]
Random geometric graphs are random graph models defined on metric spaces.
We show how to efficiently reconstruct the geometry of the underlying space from the sampled graph.
arXiv Detail & Related papers (2024-02-14T21:34:44Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Visualizing Riemannian data with Rie-SNE [0.0]
We extend the classic neighbor embedding algorithm to data on general Riemannian manifolds.
We replace standard assumptions with Riemannian diffusion counterparts and propose an efficient approximation.
We demonstrate that the approach also allows for mapping data from one manifold to another, e.g. from a high-dimensional sphere to a low-dimensional one.
arXiv Detail & Related papers (2022-03-17T11:21:44Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe for measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
- Computationally Tractable Riemannian Manifolds for Graph Embeddings [10.420394952839242]
We show how to learn and optimize graph embeddings in certain curved Riemannian spaces.
Our results serve as new evidence for the benefits of non-Euclidean embeddings in machine learning pipelines.
arXiv Detail & Related papers (2020-02-20T10:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.