Non-Parametric Manifold Learning
- URL: http://arxiv.org/abs/2107.08089v3
- Date: Mon, 15 May 2023 19:04:28 GMT
- Title: Non-Parametric Manifold Learning
- Authors: Dena Marie Asta
- Abstract summary: We introduce an estimator for distances in a compact Riemannian manifold based on graph Laplacian estimates of the Laplace-Beltrami operator.
A consequence is a proof of consistency for (untruncated) manifold distances.
The estimator resembles, and in fact its convergence properties are derived from, a special case of the Kantorovich dual reformulation of Wasserstein distance known as Connes' Distance Formula.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce an estimator for distances in a compact Riemannian manifold
based on graph Laplacian estimates of the Laplace-Beltrami operator. We upper
bound the error in the estimate of manifold distances, or more precisely an
estimate of a spectrally truncated variant of manifold distance of interest in
non-commutative geometry (cf. [Connes and Suijlekom, 2020]), in terms of
spectral errors in the graph Laplacian estimates and, implicitly, several
geometric properties of the manifold. A consequence is a proof of consistency
for (untruncated) manifold distances. The estimator resembles, and in fact its
convergence properties are derived from, a special case of the Kantorovich dual
reformulation of Wasserstein distance known as Connes' Distance Formula.
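As intuition for this dual viewpoint, the sketch below estimates the distance between two sampled points by maximizing f(x) - f(y) over functions that are 1-Lipschitz along the edges of a neighborhood graph, a discrete analogue of a Kantorovich-dual / Connes-type formula. This is only an illustrative linear program under assumed choices (k-nearest-neighbor graph, Euclidean edge weights), not the paper's spectrally truncated, graph-Laplacian-based estimator; the function name and parameters are hypothetical.

```python
# Illustrative sketch only (not the paper's estimator): approximate the distance
# between two sampled points via a Kantorovich-dual / Connes-style formula
#     d(x, y) ≈ sup { f(x) - f(y) : f 1-Lipschitz along neighborhood-graph edges },
# solved as a linear program on a k-nearest-neighbor graph.
import numpy as np
from scipy.optimize import linprog
from scipy.spatial import cKDTree


def dual_distance_estimate(points, i, j, k=10):
    """Maximize f[i] - f[j] subject to |f[a] - f[b]| <= ||points[a] - points[b]||
    for every edge (a, b) of the k-nearest-neighbor graph."""
    n = len(points)
    _, nbrs = cKDTree(points).query(points, k=k + 1)   # row a: a itself + its k neighbors
    edges = {(min(a, b), max(a, b)) for a in range(n) for b in nbrs[a, 1:]}

    # linprog minimizes, so minimize -(f[i] - f[j]) to maximize f[i] - f[j].
    c = np.zeros(n)
    c[i], c[j] = -1.0, 1.0

    # Two inequality rows per edge encode the edgewise Lipschitz constraint.
    A_ub, b_ub = [], []
    for a, b in edges:
        w = float(np.linalg.norm(points[a] - points[b]))
        row = np.zeros(n)
        row[a], row[b] = 1.0, -1.0
        A_ub.extend([row, -row])
        b_ub.extend([w, w])

    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] * n, method="highs")
    assert res.success
    return -res.fun


# Usage: points near the unit circle; the estimate approximates arc length
# (the intrinsic distance) rather than chord length.
rng = np.random.default_rng(0)
theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, 400))
pts = np.column_stack([np.cos(theta), np.sin(theta)])
print(dual_distance_estimate(pts, 0, 100), abs(theta[100] - theta[0]))
```

With edgewise Lipschitz constraints this maximum coincides with the shortest-path distance on the graph, so for dense samples the dual estimate tracks geodesic distance; the paper's estimator instead works through spectral (graph-Laplacian) quantities.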
Related papers
- Consistent Estimation of a Class of Distances Between Covariance Matrices [7.291687946822539]
We are interested in the family of distances that can be expressed as sums of traces of functions that are separately applied to each covariance matrix.
A statistical analysis of the behavior of this class of distance estimators has also been conducted.
We present a central limit theorem that establishes the Gaussianity of these estimators and provides closed form expressions for the corresponding means and variances.
arXiv Detail & Related papers (2024-09-18T07:36:25Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- Product Geometries on Cholesky Manifolds with Applications to SPD Manifolds [65.04845593770727]
We present two new metrics on the Symmetric Positive Definite (SPD) manifold via the Cholesky manifold.
Our metrics are easy to use, computationally efficient, and numerically stable.
arXiv Detail & Related papers (2024-07-02T18:46:13Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z)
- Tangent Space and Dimension Estimation with the Wasserstein Distance [10.118241139691952]
Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space.
We provide mathematically rigorous bounds on the number of sample points required to estimate both the dimension and the tangent spaces of that manifold.
arXiv Detail & Related papers (2021-10-12T21:02:06Z)
- A Graph-based approach to derive the geodesic distance on Statistical manifolds: Application to Multimedia Information Retrieval [5.1388648724853825]
We leverage the properties of non-Euclidean geometry to define the geodesic distance.
We propose an approximation of the geodesic distance through a graph-based method (a minimal shortest-path sketch appears after this list).
Our main aim is to compare the graph-based approximation to state-of-the-art approximations.
arXiv Detail & Related papers (2021-06-26T16:39:54Z)
- Intrinsic persistent homology via density-based metric learning [1.0499611180329804]
We prove that the metric space defined by the sample, endowed with a computable metric known as the sample Fermat distance, converges almost surely.
The limiting object is the manifold itself endowed with the population Fermat distance, an intrinsic metric that accounts for both the geometry of the manifold and the density that produces the sample.
arXiv Detail & Related papers (2020-12-11T18:54:36Z)
- Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether such unsupervised factorization is possible to the question of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)
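The graph-based geodesic-distance entry above proposes approximating geodesic distances by shortest paths on a graph built from the data. Below is a minimal sketch of that generic idea using a k-nearest-neighbor graph with Euclidean edge weights; the related paper works with statistical manifolds and an information-geometric ground metric, and the function name and parameters here are hypothetical.

```python
# Illustrative sketch only: generic graph-based approximation of geodesic
# distance via shortest paths on a symmetrized k-nearest-neighbor graph.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra
from scipy.spatial import cKDTree


def graph_geodesics(points, k=10):
    """Return an (n, n) matrix of shortest-path distances on the kNN graph
    with Euclidean edge weights."""
    n = len(points)
    dists, nbrs = cKDTree(points).query(points, k=k + 1)  # column 0 is the point itself
    rows = np.repeat(np.arange(n), k)
    cols = nbrs[:, 1:].ravel()
    weights = dists[:, 1:].ravel()
    graph = csr_matrix((weights, (rows, cols)), shape=(n, n))
    graph = graph.maximum(graph.T)                         # symmetrize the graph
    return dijkstra(graph, directed=False)


# Usage: on samples from the unit circle the shortest-path distances
# approximate arc length (geodesic distance) rather than chord length.
rng = np.random.default_rng(1)
theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, 500))
pts = np.column_stack([np.cos(theta), np.sin(theta)])
D = graph_geodesics(pts)
print(D[0, 250], abs(theta[250] - theta[0]))
```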
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.