Intrinsic persistent homology via density-based metric learning
- URL: http://arxiv.org/abs/2012.07621v1
- Date: Fri, 11 Dec 2020 18:54:36 GMT
- Title: Intrinsic persistent homology via density-based metric learning
- Authors: Eugenio Borghini, Ximena Fernández, Pablo Groisman, Gabriel Mindlin
- Abstract summary: We prove that the metric space defined by the sample, endowed with a computable metric known as the sample Fermat distance, converges almost surely in the Gromov-Hausdorff sense.
The limiting object is the manifold itself endowed with the population Fermat distance, an intrinsic metric that accounts for both the geometry of the manifold and the density that produces the sample.
- Score: 1.0499611180329804
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address the problem of estimating intrinsic distances in a manifold from a
finite sample. We prove that the metric space defined by the sample endowed
with a computable metric known as sample Fermat distance converges a.s. in the
sense of Gromov-Hausdorff. The limiting object is the manifold itself endowed
with the population Fermat distance, an intrinsic metric that accounts for both
the geometry of the manifold and the density that produces the sample. This
result is applied to obtain sample persistence diagrams that converge towards
an intrinsic persistence diagram. Through theoretical results and
computational experiments, we show that this method outperforms more standard
approaches based on the Euclidean norm.
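The sample Fermat distance the abstract refers to admits a short computational sketch: for points in the sample, it is the shortest-path distance in the complete graph on the sample, with edge weights given by Euclidean distances raised to a power p >= 1. The sketch below is a minimal illustration of that idea, not the authors' implementation; the dimension- and density-dependent normalization constants from the paper are omitted.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def sample_fermat_distance(points, p=2.0):
    """All-pairs sample Fermat distance for a finite point cloud.

    Builds the complete graph whose edge weights are Euclidean
    distances raised to the power p (p >= 1), then returns the
    matrix of shortest-path distances in that graph.
    """
    # Pairwise Euclidean distances via broadcasting.
    diff = points[:, None, :] - points[None, :, :]
    euclid = np.linalg.norm(diff, axis=-1)
    # Deformed edge lengths: large p penalizes long jumps, so paths
    # prefer to travel through high-density regions of the sample.
    weights = euclid ** p
    # Dijkstra on the dense weight matrix (diagonal zeros = no self-edge).
    return shortest_path(weights, method="D", directed=False)

# Toy example: three collinear points. With p = 2 the path through the
# middle point costs 0.5**2 + 0.5**2 = 0.5, beating the direct edge
# (1.0**2 = 1.0), so the Fermat distance between the endpoints is 0.5.
pts = np.array([[0.0], [0.5], [1.0]])
D = sample_fermat_distance(pts, p=2.0)
```

For p = 1 this reduces to ordinary geodesic distance on the complete graph (hence the Euclidean distance itself); larger p increasingly favors routes through densely sampled regions, which is how the metric encodes the sampling density.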
Related papers
- Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z)
- Computing distances and means on manifolds with a metric-constrained Eikonal approach [4.266376725904727]
We introduce the metric-constrained Eikonal solver to obtain continuous, differentiable representations of distance functions.
The differentiable nature of these representations allows for the direct computation of globally length-minimising paths on the manifold.
arXiv Detail & Related papers (2024-04-12T18:26:32Z)
- Sampling and estimation on manifolds using the Langevin diffusion [45.57801520690309]
Two estimators of linear functionals of $\mu_\phi$ based on the discretized Markov process are considered.
Error bounds are derived for sampling and estimation using a discretization of an intrinsically defined Langevin diffusion.
arXiv Detail & Related papers (2023-12-22T18:01:11Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Fermat Distances: Metric Approximation, Spectral Convergence, and Clustering Algorithms [7.07321040534471]
We prove that Fermat distances converge to their continuum analogues in small neighborhoods with a precise rate that depends on the intrinsic dimensionality of the data.
Our results are then used to prove that discrete graph Laplacians based on discrete, sample-driven Fermat distances converge to corresponding continuum operators.
The perspective afforded by our discrete-to-continuum Fermat distance analysis leads to new clustering algorithms for data.
arXiv Detail & Related papers (2023-07-07T16:36:00Z)
- Learning Discretized Neural Networks under Ricci Flow [51.36292559262042]
We study Discretized Neural Networks (DNNs) composed of low-precision weights and activations.
Because of the non-differentiable discrete functions involved, DNNs suffer from either infinite or zero gradients during training.
arXiv Detail & Related papers (2023-02-07T10:51:53Z)
- Convergence of the Riemannian Langevin Algorithm [10.279748604797911]
We study the problem of sampling from a distribution with density $\nu$ with respect to the natural measure on a manifold with metric $g$.
A special case of our approach is sampling isoperimetric densities restricted to polytopes defined by the logarithmic barrier.
arXiv Detail & Related papers (2022-04-22T16:56:00Z)
- Tangent Space and Dimension Estimation with the Wasserstein Distance [10.118241139691952]
Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space.
We provide mathematically rigorous bounds on the number of sample points required to estimate both the dimension and the tangent spaces of that manifold.
arXiv Detail & Related papers (2021-10-12T21:02:06Z)
- Kernel distance measures for time series, random fields and other structured data [71.61147615789537]
kdiff is a novel kernel-based measure for estimating distances between instances of structured data.
It accounts for both self and cross similarities across the instances and is defined using a lower quantile of the distance distribution.
Some theoretical results are provided for separability conditions using kdiff as a distance measure for clustering and classification problems.
arXiv Detail & Related papers (2021-09-29T22:54:17Z)
- Generative Modeling with Denoising Auto-Encoders and Langevin Sampling [88.83704353627554]
We show that both DAE and DSM provide estimates of the score of the smoothed population density.
We then apply our results to the homotopy method of arXiv:1907.05600 and provide theoretical justification for its empirical success.
arXiv Detail & Related papers (2020-01-31T23:50:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.