A dual basis approach to multidimensional scaling
- URL: http://arxiv.org/abs/2303.05682v2
- Date: Tue, 30 Jul 2024 21:03:15 GMT
- Title: A dual basis approach to multidimensional scaling
- Authors: Samuel Lichtenberg, Abiy Tasissa
- Abstract summary: CMDS is a technique that embeds objects in a Euclidean space given their pairwise Euclidean distances.
We give an explicit formula for the dual basis vectors and fully characterize the spectrum of an essential matrix in the dual basis framework.
- Score: 3.069335774032178
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Classical multidimensional scaling (CMDS) is a technique that embeds a set of objects in a Euclidean space given their pairwise Euclidean distances. The main part of CMDS involves double centering a squared distance matrix and using a truncated eigendecomposition to recover the point coordinates. In this paper, motivated by a study in Euclidean distance geometry, we explore a dual basis approach to CMDS. We give an explicit formula for the dual basis vectors and fully characterize the spectrum of an essential matrix in the dual basis framework. We make connections to a related problem in metric nearness.
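The abstract describes the two core steps of CMDS: double centering the squared distance matrix and taking a truncated eigendecomposition to recover coordinates. Below is a minimal sketch of that standard procedure using NumPy; the function and variable names are illustrative and not taken from the paper, and the dual basis construction itself is not shown here.

```python
# Minimal sketch of classical multidimensional scaling (CMDS), assuming
# D_sq is an n x n matrix of squared pairwise Euclidean distances and d
# is the target embedding dimension. Names are illustrative only.
import numpy as np

def classical_mds(D_sq: np.ndarray, d: int) -> np.ndarray:
    """Embed n points in R^d from their squared distance matrix D_sq."""
    n = D_sq.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix J = I - (1/n) 11^T
    B = -0.5 * J @ D_sq @ J               # double centering: Gram matrix of centered points
    eigvals, eigvecs = np.linalg.eigh(B)  # B is symmetric; eigh returns ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:d]   # keep the d largest eigenvalues
    lam = np.clip(eigvals[idx], 0.0, None)  # guard against small negative values from noise
    return eigvecs[:, idx] * np.sqrt(lam)   # coordinates X with B ~= X X^T

# Usage note: for exact squared Euclidean distances of points in R^d, the
# recovered coordinates match the originals up to rotation, reflection,
# and translation.
```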
Related papers
- Enforcing Latent Euclidean Geometry in Single-Cell VAEs for Manifold Interpolation [79.27003481818413]
We introduce FlatVI, a training framework that regularises the latent manifold of discrete-likelihood variational autoencoders towards Euclidean geometry. By encouraging straight lines in the latent space to approximate geodesics on the decoded single-cell manifold, FlatVI enhances compatibility with downstream approaches.
arXiv Detail & Related papers (2025-07-15T23:08:14Z) - A Dual Basis Approach for Structured Robust Euclidean Distance Geometry [6.422262171968397]
This paper considers the setting where only a set of anchor nodes is used to collect the distances between themselves and the rest. In the presence of potential outliers, this results in a structured partial observation of the EDM with partial corruptions. We propose a novel algorithmic framework, dubbed Robust Euclidean Distance Geometry via Dual Basis (RoDEoDB), for recovering the Euclidean distance geometry.
arXiv Detail & Related papers (2025-05-23T22:40:21Z) - Manifold learning in metric spaces [4.849550522970841]
Laplacian-based methods are popular for dimensionality reduction of data lying in $\mathbb{R}^N$.
We provide a framework that generalizes the problem of manifold learning to metric spaces and study when a metric satisfies sufficient conditions for the pointwise convergence of the graph Laplacian.
arXiv Detail & Related papers (2025-03-20T14:37:40Z) - RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices we propose Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z) - Product Geometries on Cholesky Manifolds with Applications to SPD Manifolds [65.04845593770727]
We present two new metrics on the Symmetric Positive Definite (SPD) manifold via the Cholesky manifold.
Our metrics are easy to use, computationally efficient, and numerically stable.
arXiv Detail & Related papers (2024-07-02T18:46:13Z) - Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
arXiv Detail & Related papers (2024-02-14T16:10:42Z) - The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z) - Improving Metric Dimensionality Reduction with Distributed Topology [68.8204255655161]
DIPOLE is a dimensionality-reduction post-processing step that corrects an initial embedding by minimizing a loss functional with both a local, metric term and a global, topological term.
We observe that DIPOLE outperforms popular methods like UMAP, t-SNE, and Isomap on a number of popular datasets.
arXiv Detail & Related papers (2021-06-14T17:19:44Z) - Matrix factorisation and the interpretation of geodesic distance [6.445605125467574]
Given a graph or similarity matrix, we consider the problem of recovering a notion of true distance between the nodes.
We show that this can be accomplished in two steps: matrix factorisation, followed by nonlinear dimension reduction.
arXiv Detail & Related papers (2021-06-02T16:11:33Z) - Manifold learning with arbitrary norms [8.433233101044197]
We show in a numerical simulation that manifold learning based on Earthmover's distances outperforms the standard Euclidean variant for learning molecular shape spaces.
arXiv Detail & Related papers (2020-12-28T10:24:30Z) - Spectral Flow on the Manifold of SPD Matrices for Multimodal Data Processing [17.162497914078322]
We consider data acquired by multimodal sensors capturing complementary aspects and features of a measured phenomenon.
We focus on a scenario in which the measurements share mutual sources of variability but might also be contaminated by other measurement-specific sources.
arXiv Detail & Related papers (2020-09-17T04:38:57Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method based on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z) - Learning Flat Latent Manifolds with VAEs [16.725880610265378]
We propose an extension to the framework of variational auto-encoders, where the Euclidean metric is a proxy for the similarity between data points.
We replace the compact prior typically used in variational auto-encoders with a recently presented, more expressive hierarchical one.
We evaluate our method on a range of data-sets, including a video-tracking benchmark.
arXiv Detail & Related papers (2020-02-12T09:54:52Z)