Matrix factorisation and the interpretation of geodesic distance
- URL: http://arxiv.org/abs/2106.01260v1
- Date: Wed, 2 Jun 2021 16:11:33 GMT
- Title: Matrix factorisation and the interpretation of geodesic distance
- Authors: Nick Whiteley, Annie Gray and Patrick Rubin-Delanchy
- Abstract summary: Given a graph or similarity matrix, we consider the problem of recovering a notion of true distance between the nodes.
We show that this can be accomplished in two steps: matrix factorisation, followed by nonlinear dimension reduction.
- Score: 6.445605125467574
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Given a graph or similarity matrix, we consider the problem of recovering a
notion of true distance between the nodes, and so their true positions. Through
new insights into the manifold geometry underlying a generic latent position
model, we show that this can be accomplished in two steps: matrix
factorisation, followed by nonlinear dimension reduction. This combination is
effective because the point cloud obtained in the first step lives close to a
manifold in which latent distance is encoded as geodesic distance. Hence, a
nonlinear dimension reduction tool, approximating geodesic distance, can
recover the latent positions, up to a simple transformation. We give a detailed
account of the case where spectral embedding is used, followed by Isomap, and
provide encouraging experimental evidence for other combinations of techniques.
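As an illustration of this two-step recipe, the sketch below builds a graph from hypothetical latent positions, takes a rank-d adjacency spectral embedding as the matrix-factorisation step, and then runs Isomap on the resulting point cloud. It is a minimal reading of the pipeline under assumed parameters (kernel, dimension d, neighbourhood size), not the authors' reference implementation.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)

# Hypothetical latent positions lying on a one-dimensional curve in 2-D.
t = rng.uniform(0, 1, size=500)
Z = np.column_stack([np.cos(np.pi * t), np.sin(np.pi * t)])

# Generic latent position model: connection probabilities from a kernel of latent positions.
P = np.exp(-4.0 * np.square(Z[:, None, :] - Z[None, :, :]).sum(axis=-1))
A = np.triu(rng.binomial(1, P), k=1)
A = A + A.T                                  # symmetric adjacency matrix, no self-loops

# Step 1: matrix factorisation via spectral embedding (top-d eigenpairs of A).
d = 2
vals, vecs = np.linalg.eigh(A)
top = np.argsort(np.abs(vals))[::-1][:d]
X_hat = vecs[:, top] * np.sqrt(np.abs(vals[top]))

# Step 2: nonlinear dimension reduction; Isomap approximates geodesic distance
# on the manifold near which the embedded point cloud lives.
Y_hat = Isomap(n_neighbors=10, n_components=1).fit_transform(X_hat)
# Y_hat should recover the latent parameter t up to a simple transformation.
```

Other factorisation or manifold-learning tools can be swapped in at either step in the same pattern, which is the kind of combination the paper's experiments explore.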
Related papers
- Sample-Efficient Geometry Reconstruction from Euclidean Distances using Non-Convex Optimization [7.114174944371803]
The problem of finding a suitable point embedding given only Euclidean distance information between point pairs arises both as a core task and as a sub-problem in machine learning.
In this paper, we aim to solve this problem given a minimal number of samples.
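For context on the fully sampled version of this task (not the sample-efficient non-convex method proposed in the paper), classical multidimensional scaling recovers a point configuration from a complete Euclidean distance matrix; a minimal sketch:

```python
import numpy as np

def classical_mds(D, dim):
    """Recover a point configuration (up to rigid motion) from a full
    Euclidean distance matrix D via double centring + eigendecomposition."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    G = -0.5 * J @ (D ** 2) @ J                # Gram matrix of centred points
    vals, vecs = np.linalg.eigh(G)
    top = np.argsort(vals)[::-1][:dim]         # keep the largest eigenvalues
    return vecs[:, top] * np.sqrt(np.clip(vals[top], 0, None))

# Example: the reconstruction matches the original points up to rotation/translation.
X = np.random.default_rng(1).normal(size=(50, 3))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
X_rec = classical_mds(D, dim=3)
```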
arXiv Detail & Related papers (2024-10-22T13:02:12Z) - Reconstructing the Geometry of Random Geometric Graphs [9.004991291124096]
Random geometric graphs are random graph models defined on metric spaces.
We show how to efficiently reconstruct the geometry of the underlying space from the sampled graph.
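As a hypothetical toy instance of the model (the generative step only, not the paper's reconstruction algorithm): a random geometric graph samples points in a metric space and joins pairs whose distance falls below a threshold.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample n latent points uniformly in the unit square (the underlying metric space).
n, radius = 200, 0.15
points = rng.uniform(size=(n, 2))

# Connect every pair of points whose Euclidean distance is below the radius.
dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
A = (dist < radius).astype(int)
np.fill_diagonal(A, 0)       # no self-loops; A is the sampled adjacency matrix
```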
arXiv Detail & Related papers (2024-02-14T21:34:44Z) - Gradient Descent Provably Solves Nonlinear Tomographic Reconstruction [60.95625458395291]
In computed tomography (CT) the forward model consists of a linear transform followed by an exponential nonlinearity based on the attenuation of light according to the Beer-Lambert Law.
We show that this approach reduces metal artifacts compared to a commercial reconstruction of a human skull with metal crowns.
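A minimal rendering of that forward model, with a random matrix standing in for the real CT projection geometry, might look as follows; it only illustrates the "linear transform followed by exponential nonlinearity" structure.

```python
import numpy as np

def ct_forward(x, A, I0=1.0):
    """Beer-Lambert forward model: line integrals Ax of the attenuation
    image x, followed by exponential decay of the incident intensity I0."""
    return I0 * np.exp(-A @ x)

# Toy example: a random projection matrix stands in for the CT system geometry.
rng = np.random.default_rng(3)
A = rng.uniform(size=(64, 256))          # 64 "rays" through a 256-pixel image
x = np.abs(rng.normal(size=256))         # non-negative attenuation coefficients
measurements = ct_forward(x, A)
```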
arXiv Detail & Related papers (2023-10-06T00:47:57Z) - Vanishing Point Estimation in Uncalibrated Images with Prior Gravity Direction [82.72686460985297]
We tackle the problem of estimating a Manhattan frame.
We derive two new 2-line solvers, one of which does not suffer from singularities affecting existing solvers.
We also design a new non-minimal method, running on an arbitrary number of lines, to boost the performance in local optimization.
arXiv Detail & Related papers (2023-08-21T13:03:25Z) - Short and Straight: Geodesics on Differentiable Manifolds [6.85316573653194]
In this work, we first analyse existing methods for computing length-minimising geodesics.
Second, we propose a model-based parameterisation for distance fields and geodesic flows on continuous manifolds.
Third, we develop a curvature-based training mechanism, sampling and scaling points in regions of the manifold exhibiting larger values of the Ricci scalar.
arXiv Detail & Related papers (2023-05-24T15:09:41Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
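For reference, the Laplace-mechanism baseline being compared against adds noise calibrated to the query's global sensitivity; the sketch below shows that standard baseline, not the paper's gradient-based approach.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=np.shape(value))

# Example: privatise a mean over n records bounded in [0, 1] (sensitivity 1/n).
data = np.random.default_rng(4).uniform(size=100)
private_mean = laplace_mechanism(data.mean(), sensitivity=1.0 / len(data), epsilon=0.5)
```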
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretizations.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z) - Improving Metric Dimensionality Reduction with Distributed Topology [68.8204255655161]
DIPOLE is a dimensionality-reduction post-processing step that corrects an initial embedding by minimizing a loss functional with both a local, metric term and a global, topological term.
We observe that DIPOLE outperforms popular methods like UMAP, t-SNE, and Isomap on a number of standard datasets.
arXiv Detail & Related papers (2021-06-14T17:19:44Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method for a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z) - Rehabilitating Isomap: Euclidean Representation of Geodesic Structure [0.0]
We argue that Isomap is better understood as constructing Euclidean representations of geodesic structure.
We revisit the rationale for Isomap, clarifying what Isomap does and what it does not.
arXiv Detail & Related papers (2020-06-18T21:21:30Z)
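That reading of Isomap can be made concrete: build a neighbourhood graph, take shortest-path distances as geodesic estimates, then apply classical MDS to obtain Euclidean coordinates. The sketch below is a schematic of this pipeline with assumed defaults, not the authors' code.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def isomap_embedding(X, n_neighbors=10, dim=2):
    # 1) k-nearest-neighbour graph with Euclidean edge weights.
    G = kneighbors_graph(X, n_neighbors=n_neighbors, mode="distance")
    # 2) geodesic distance estimates = graph shortest-path distances.
    D = shortest_path(G, method="D", directed=False)
    # 3) classical MDS on the geodesic distance matrix.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:dim]
    return vecs[:, top] * np.sqrt(np.clip(vals[top], 0, None))
```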
This list is automatically generated from the titles and abstracts of the papers on this site.