An Intrinsic Approach to Scalar-Curvature Estimation for Point Clouds
- URL: http://arxiv.org/abs/2308.02615v1
- Date: Fri, 4 Aug 2023 14:29:50 GMT
- Title: An Intrinsic Approach to Scalar-Curvature Estimation for Point Clouds
- Authors: Abigail Hickok and Andrew J. Blumberg
- Abstract summary: We introduce an intrinsic estimator for the scalar curvature of a data set presented as a finite metric space.
Our estimator depends only on the metric structure of the data and not on an embedding in $\mathbb{R}^n$.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce an intrinsic estimator for the scalar curvature of a data set
presented as a finite metric space. Our estimator depends only on the metric
structure of the data and not on an embedding in $\mathbb{R}^n$. We show that
the estimator is consistent in the sense that for points sampled from a
probability measure on a compact Riemannian manifold, the estimator converges
to the scalar curvature as the number of points increases. To justify its use
in applications, we show that the estimator is stable with respect to
perturbations of the metric structure, e.g., noise in the sample or error
estimating the intrinsic metric. We validate our estimator experimentally on
synthetic data that is sampled from manifolds with specified curvature.
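The general volume-comparison idea behind intrinsic curvature estimation can be sketched in code. The following is a minimal illustration under standard assumptions, not the authors' algorithm: it uses the classical ball-volume expansion $\mathrm{vol}(B_r) \approx c\, r^n \left(1 - \frac{S r^2}{6(n+2)}\right)$ on an $n$-manifold with scalar curvature $S$, and fits the leading correction to average ball counts computed from an intrinsic (geodesic) distance matrix. The function name and interface are hypothetical.

```python
import numpy as np

def estimate_scalar_curvature(dists, n, radii):
    """Rough scalar-curvature sketch from a pairwise intrinsic-distance matrix.

    dists : (N, N) geodesic distance matrix of the sample
    n     : intrinsic dimension of the manifold
    radii : increasing radii, small relative to the injectivity radius
    """
    # Average number of sample points in a ball of each radius
    # (subtracting 1 excludes the center point itself).
    counts = np.array([((dists < r).sum(axis=1) - 1).mean() for r in radii])
    # Ball-volume expansion: vol(B_r) ~ c * r^n * (1 - S r^2 / (6(n+2))).
    # Dividing counts by r^n and fitting a line in r^2 recovers S.
    y = counts / radii**n                       # ~ c - c*S/(6(n+2)) * r^2
    A = np.vstack([np.ones_like(radii), radii**2]).T
    (c, slope), *_ = np.linalg.lstsq(A, y, rcond=None)
    return -6 * (n + 2) * slope / c
```

For points sampled uniformly from the unit 2-sphere (scalar curvature 2), with geodesic distances given by arc length, the fitted value lands near 2 for moderate sample sizes, consistent with the consistency result the abstract describes.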
Related papers
- A Geometric Unification of Distributionally Robust Covariance Estimators: Shrinking the Spectrum by Inflating the Ambiguity Set [20.166217494056916]
We propose a principled approach to construct covariance estimators without imposing restrictive assumptions.
We show that our robust estimators are efficiently computable and consistent.
Numerical experiments based on synthetic and real data show that our robust estimators are competitive with state-of-the-art estimators.
arXiv Detail & Related papers (2024-05-30T15:01:18Z) - Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Score Matching for Truncated Density Estimation on a Manifold [6.53626518989653]
Recent methods propose to use score matching for truncated density estimation.
We present a novel extension of truncated score matching to a Riemannian manifold with boundary.
In simulated data experiments, our score matching estimator is able to approximate the true parameter values with a low estimation error.
arXiv Detail & Related papers (2022-06-29T14:14:49Z) - Tangent Space and Dimension Estimation with the Wasserstein Distance [10.118241139691952]
Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space.
We provide mathematically rigorous bounds on the number of sample points required to estimate both the dimension and the tangent spaces of that manifold.
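As an illustration of the kind of estimator such sample-complexity bounds apply to, here is a standard local-PCA sketch for tangent-space and dimension estimation; it is not the paper's specific procedure, and the function name and threshold are our own choices.

```python
import numpy as np

def local_pca(points, center_idx, k=20):
    """Estimate intrinsic dimension and a tangent basis at one sample point.

    Standard local PCA: take the k nearest neighbors, center them, and count
    the singular values that carry a non-negligible share of the variance.
    """
    p = points[center_idx]
    d = np.linalg.norm(points - p, axis=1)
    nbrs = points[np.argsort(d)[1:k + 1]]      # k nearest neighbors (self excluded)
    nbrs = nbrs - nbrs.mean(axis=0)            # center the neighborhood
    _, s, vt = np.linalg.svd(nbrs, full_matrices=False)
    # Directions whose singular value is a sizable fraction of the largest
    # are treated as tangent; the rest as curvature/noise directions.
    dim = int((s > 0.3 * s[0]).sum())
    return dim, vt[:dim]                        # dimension and tangent basis rows
```

On a dense sample from the unit 2-sphere, the estimated dimension is 2 and the recovered basis is nearly orthogonal to the radial direction.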
arXiv Detail & Related papers (2021-10-12T21:02:06Z) - Intrinsic persistent homology via density-based metric learning [1.0499611180329804]
We prove that the metric space defined by the sample, endowed with a computable metric known as the sample Fermat distance, converges almost surely to the manifold itself endowed with the population Fermat distance.
The population Fermat distance is an intrinsic metric that accounts for both the geometry of the manifold and the density that produces the sample.
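The sample Fermat distance has a simple combinatorial form: the cost of a hop between two sample points is their Euclidean distance raised to a power $p \geq 1$, and the distance between two points is the cheapest path through the sample. A minimal sketch (all-pairs Floyd-Warshall over the complete graph; the function name is ours, and real implementations use sparser graphs and faster shortest-path algorithms):

```python
import numpy as np

def sample_fermat_distance(points, p=2.0):
    """All-pairs sample Fermat distance via Floyd-Warshall.

    Edge weights are Euclidean distances raised to the power p; for p > 1,
    minimizing path sums of these weights favors routes through dense regions.
    """
    diff = points[:, None, :] - points[None, :, :]
    w = np.linalg.norm(diff, axis=-1) ** p      # powered edge weights
    d = w.copy()
    for k in range(len(points)):                # Floyd-Warshall relaxation
        d = np.minimum(d, d[:, k:k + 1] + d[k:k + 1, :])
    return d
```

For three collinear points at 0, 1, 2 with p = 2, the direct hop from 0 to 2 costs 4, but routing through the middle point costs 1 + 1 = 2, showing how the powered metric rewards paths that stay close to the sample.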
arXiv Detail & Related papers (2020-12-11T18:54:36Z) - $\gamma$-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator [95.71091446753414]
We propose to use a nearest-neighbor-based $\gamma$-divergence estimator as a data discrepancy measure.
Our method achieves significantly higher robustness than existing discrepancy measures.
arXiv Detail & Related papers (2020-06-13T06:09:27Z) - Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
arXiv Detail & Related papers (2020-05-20T15:01:03Z) - Nonparametric Estimation of the Fisher Information and Its Applications [82.00720226775964]
This paper considers the problem of estimation of the Fisher information for location from a random sample of size $n$.
An estimator proposed by Bhattacharya is revisited and improved convergence rates are derived.
A new estimator, termed a clipped estimator, is proposed.
arXiv Detail & Related papers (2020-05-07T17:21:56Z) - Estimating Gradients for Discrete Random Variables by Sampling without Replacement [93.09326095997336]
We derive an unbiased estimator for expectations over discrete random variables based on sampling without replacement.
We show that our estimator can be derived as the Rao-Blackwellization of three different estimators.
arXiv Detail & Related papers (2020-02-14T14:15:18Z) - Finite sample properties of parametric MMD estimation: robustness to misspecification and dependence [7.011897575776511]
We show that the estimator is robust to both dependence and to the presence of outliers in the dataset.
We provide a theoretical study of the gradient descent algorithm used to compute the estimator.
arXiv Detail & Related papers (2019-12-12T02:28:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.