Curvature as a tool for evaluating dimensionality reduction and estimating intrinsic dimension
- URL: http://arxiv.org/abs/2509.13385v1
- Date: Tue, 16 Sep 2025 10:14:47 GMT
- Title: Curvature as a tool for evaluating dimensionality reduction and estimating intrinsic dimension
- Authors: Charlotte Beylier, Parvaneh Joharinad, Jürgen Jost, Nahid Torbati
- Abstract summary: We introduce a method for constructing a curvature-based geometric profile of discrete metric spaces. More significantly, based on this curvature profile, we introduce a quantitative measure to evaluate the effectiveness of data representations. Our experiments demonstrate that this curvature-based analysis can be employed to estimate the intrinsic dimensionality of datasets.
- Score: 1.3019517863608956
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Utilizing recently developed abstract notions of sectional curvature, we introduce a method for constructing a curvature-based geometric profile of discrete metric spaces. The curvature concept that we use here captures the metric relations between triples of points and other points. More significantly, based on this curvature profile, we introduce a quantitative measure to evaluate the effectiveness of data representations, such as those produced by dimensionality reduction techniques. Furthermore, our experiments demonstrate that this curvature-based analysis can be employed to estimate the intrinsic dimensionality of datasets. We use this to explore the large-scale geometry of empirical networks and to evaluate the effectiveness of dimensionality reduction techniques.
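The abstract describes building a geometric profile from curvature values computed over triples of points in a discrete metric space. As a rough illustration of the idea (not the paper's exact curvature definition, which is based on the authors' abstract notion of sectional curvature), the sketch below computes a "tripod defect" for sampled triples: in a metric tree, every triple admits a median point realizing its Gromov products, so the defect vanishes; its distribution over many triples serves as a simple curvature-style signature of the space. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def curvature_profile(D, n_triples=200, rng=None):
    """Toy curvature-style profile of a discrete metric space.

    D: (n, n) symmetric matrix of pairwise distances.
    For each sampled triple (i, j, k), compute the Gromov products
    r_i = (d_ij + d_ik - d_jk) / 2 (and cyclically). In a metric tree
    there is a median point m with d(m, x_i) = r_i, so the defect
    min_y max_i (d(y, x_i) - r_i) is zero; deviations from zero
    indicate curvature-like behavior of the space around the triple.
    """
    rng = np.random.default_rng(rng)
    n = D.shape[0]
    defects = []
    for _ in range(n_triples):
        i, j, k = rng.choice(n, size=3, replace=False)
        r_i = 0.5 * (D[i, j] + D[i, k] - D[j, k])
        r_j = 0.5 * (D[i, j] + D[j, k] - D[i, k])
        r_k = 0.5 * (D[i, k] + D[j, k] - D[i, j])
        # Best "median" candidate among all points of the space.
        defect = np.min(np.maximum.reduce([
            D[:, i] - r_i, D[:, j] - r_j, D[:, k] - r_k]))
        defects.append(defect)
    return np.asarray(defects)
```

In the spirit of the paper's evaluation measure, one could compare the profile of a dataset with the profile of its low-dimensional embedding (e.g., by a histogram distance): a faithful representation should roughly preserve the distribution of these triple-based quantities.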
Related papers
- Dimensionality Reduction on Riemannian Manifolds in Data Analysis [0.0]
Experimental results show improved representation quality and classification performance compared to Euclidean counterparts. This study underscores the importance of geometry-aware dimensionality reduction in modern machine learning and data science applications.
arXiv Detail & Related papers (2026-02-05T17:46:58Z) - A general framework for adaptive nonparametric dimensionality reduction [1.8424939331296903]
In this paper, we exploit a recently proposed intrinsic dimension estimator which also returns the optimal locally adaptive neighbourhood sizes. Numerical experiments on both real-world and simulated datasets show that the proposed method can be used to significantly improve well-known projection methods.
arXiv Detail & Related papers (2025-11-12T16:59:22Z) - IIKL: Isometric Immersion Kernel Learning with Riemannian Manifold for Geometric Preservation [15.82760919569542]
Previous research generally mapped non-Euclidean data into Euclidean space during representation learning. In this paper, we propose a novel Isometric Immersion Kernel Learning (IIKL) method. We show that our method could reduce the inner product invariant loss by more than 90% compared to state-of-the-art methods.
arXiv Detail & Related papers (2025-05-07T12:08:33Z) - (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z) - The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z) - An evaluation framework for dimensionality reduction through sectional curvature [59.40521061783166]
In this work, we aim to introduce the first fully unsupervised performance metric for dimensionality reduction.
To test its feasibility, this metric has been used to evaluate the performance of the most commonly used dimension reduction algorithms.
A new parameterized problem instance generator has been constructed in the form of a function generator.
arXiv Detail & Related papers (2023-03-17T11:59:33Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how using the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Spherical Rotation Dimension Reduction with Geometric Loss Functions [0.0]
A prime example of such a dataset is a collection of cell cycle measurements, where the inherently cyclical nature of the process can be represented as a circle or sphere.
We propose a nonlinear dimension reduction method, Spherical Rotation Component Analysis (SRCA), that incorporates geometric information to better approximate the low-dimensional manifold.
arXiv Detail & Related papers (2022-04-23T02:03:55Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - Joint Dimensionality Reduction for Separable Embedding Estimation [43.22422640265388]
Low-dimensional embeddings for data from disparate sources play critical roles in machine learning, multimedia information retrieval, and bioinformatics.
We propose a supervised dimensionality reduction method that learns linear embeddings jointly for two feature vectors representing data of different modalities or data from distinct types of entities.
Our approach compares favorably against other dimensionality reduction methods, and against a state-of-the-art method of bilinear regression for predicting gene-disease associations.
arXiv Detail & Related papers (2021-01-14T08:48:37Z) - Geometric Attention for Prediction of Differential Properties in 3D Point Clouds [32.68259334785767]
In this study, we present a geometric attention mechanism that can provide such properties in a learnable fashion.
We establish the usefulness of the proposed technique with several experiments on the prediction of normal vectors and the extraction of feature lines.
arXiv Detail & Related papers (2020-07-06T07:40:26Z) - Deep Dimension Reduction for Supervised Representation Learning [51.10448064423656]
We propose a deep dimension reduction approach to learning representations with essential characteristics.
The proposed approach is a nonparametric generalization of the sufficient dimension reduction method.
We show that the estimated deep nonparametric representation is consistent in the sense that its excess risk converges to zero.
arXiv Detail & Related papers (2020-06-10T14:47:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.