Hyperbolic Distance Matrices
- URL: http://arxiv.org/abs/2005.08672v2
- Date: Fri, 11 Sep 2020 17:02:32 GMT
- Title: Hyperbolic Distance Matrices
- Authors: Puoya Tabaghi, Ivan Dokmanić
- Abstract summary: We propose a unified framework to compute hyperbolic embeddings from an arbitrary mix of noisy metric and non-metric data.
Our algorithms are based on semidefinite programming and the notion of a hyperbolic distance matrix.
We show through numerical experiments how the flexibility to mix metric and non-metric constraints allows us to efficiently compute embeddings from arbitrary data.
- Score: 31.90021739879016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperbolic space is a natural setting for mining and visualizing data with
hierarchical structure. In order to compute a hyperbolic embedding from
comparison or similarity information, one has to solve a hyperbolic distance
geometry problem. In this paper, we propose a unified framework to compute
hyperbolic embeddings from an arbitrary mix of noisy metric and non-metric
data. Our algorithms are based on semidefinite programming and the notion of a
hyperbolic distance matrix, in many ways parallel to its famous Euclidean
counterpart. A central ingredient we put forward is a semidefinite
characterization of the hyperbolic Gramian -- a matrix of Lorentzian inner
products. This characterization allows us to formulate a semidefinite
relaxation to efficiently compute hyperbolic embeddings in two stages: first,
we complete and denoise the observed hyperbolic distance matrix; second, we
propose a spectral factorization method to estimate the embedded points from
the hyperbolic distance matrix. We show through numerical experiments how the
flexibility to mix metric and non-metric constraints allows us to efficiently
compute embeddings from arbitrary data.
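The two central objects above, the hyperbolic Gramian and the hyperbolic distance matrix, are straightforward to compute once points lie on the hyperboloid ('Loid) model. The sketch below (an illustrative reading of the definitions, not the paper's code) forms the Gramian of Lorentzian inner products and recovers pairwise distances via d(x, y) = arccosh(-[x, y]); the lifting of Euclidean points onto the hyperboloid is an assumption made for the demo.

```python
import numpy as np

def hyperbolic_distance_matrix(X):
    """X: (n, d+1) points on the hyperboloid {x : [x, x] = -1, x[0] > 0}."""
    # J encodes the Lorentzian inner product [x, y] = -x0*y0 + x1*y1 + ... + xd*yd
    J = np.diag(np.concatenate(([-1.0], np.ones(X.shape[1] - 1))))
    G = X @ J @ X.T  # hyperbolic Gramian: G[i, j] = [x_i, x_j]
    # d(x_i, x_j) = arccosh(-[x_i, x_j]); clip at 1 for numerical safety,
    # since on the hyperboloid -[x_i, x_j] >= 1 holds exactly
    return np.arccosh(np.clip(-G, 1.0, None))

# lift 2D points onto the hyperboloid: x0 = sqrt(1 + ||z||^2)
Z = np.array([[0.0, 0.0], [0.3, -0.1], [1.0, 2.0]])
X = np.column_stack([np.sqrt(1.0 + (Z ** 2).sum(axis=1)), Z])
D = hyperbolic_distance_matrix(X)
```

The diagonal of D is zero and D is symmetric, mirroring the Euclidean distance-matrix structure that the paper's semidefinite characterization parallels.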
Related papers
- A Set-to-Set Distance Measure in Hyperbolic Space [50.134086375286074]
We propose a hyperbolic set-to-set distance measure (HS2SD) for computing dissimilarity between sets in hyperbolic space.
By considering the topological differences, HS2SD provides a more nuanced understanding of the relationships between two hyperbolic sets.
arXiv Detail & Related papers (2025-06-23T11:31:40Z)
- A group-theoretic framework for machine learning in hyperbolic spaces [0.0]
This paper introduces the notion of the mean (barycenter) and the novel family of probability distributions on hyperbolic balls.
We propose efficient optimization algorithms for computation of the barycenter and for maximum likelihood estimation.
One can build upon basic concepts presented here in order to design more demanding algorithms and implement hyperbolic deep learning pipelines.
arXiv Detail & Related papers (2025-01-12T21:06:38Z)
- Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
- Accelerating hyperbolic t-SNE [7.411478341945197]
This paper introduces the first acceleration structure for hyperbolic embeddings, building upon a polar quadtree.
We demonstrate that it computes embeddings of similar quality in significantly less time.
arXiv Detail & Related papers (2024-01-23T12:59:40Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce a Hyperbolic Regularization powered Collaborative Filtering (HRCF) model and design a geometry-aware hyperbolic regularizer.
Specifically, the proposal boosts optimization procedure via the root alignment and origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also brings the models a better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Hyperbolic Vision Transformers: Combining Improvements in Metric Learning [116.13290702262248]
We propose a new hyperbolic-based model for metric learning.
At the core of our method is a vision transformer with output embeddings mapped to hyperbolic space.
We evaluate the proposed model with six different formulations on four datasets.
arXiv Detail & Related papers (2022-03-21T09:48:23Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincaré ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincaré second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- Highly Scalable and Provably Accurate Classification in Poincaré Balls [40.82908295137667]
We establish a unified framework for learning scalable and simple hyperbolic linear classifiers with provable performance guarantees.
Our results include a new hyperbolic and second-order perceptron algorithm as well as an efficient and highly accurate convex optimization setup for hyperbolic support vector machine classifiers.
We demonstrate their performance on synthetic data sets comprising millions of points, as well as on complex real-world data sets such as single-cell RNA-seq expression measurements, CIFAR10, Fashion-MNIST and mini-ImageNet.
arXiv Detail & Related papers (2021-09-08T16:59:39Z)
- Semiparametric Nonlinear Bipartite Graph Representation Learning with Provable Guarantees [106.91654068632882]
We consider the bipartite graph and formalize its representation learning problem as a statistical estimation problem of parameters in a semiparametric exponential family distribution.
We show that the proposed objective is strongly convex in a neighborhood around the ground truth, so that a gradient descent-based method achieves linear convergence rate.
Our estimator is robust to any model misspecification within the exponential family, which is validated in extensive experiments.
arXiv Detail & Related papers (2020-03-02T16:40:36Z)
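The last entry's claim that gradient descent achieves a linear convergence rate on a locally strongly convex objective can be illustrated with a toy example (this is not the paper's estimator, just the generic phenomenon): on a strongly convex quadratic, the error contracts by a roughly constant factor each step.

```python
import numpy as np

# Strongly convex quadratic f(w) = 0.5 * w^T A w with minimizer w* = 0.
A = np.array([[3.0, 1.0], [1.0, 2.0]])    # symmetric positive definite
w = np.array([5.0, -4.0])                 # arbitrary starting point
step = 1.0 / np.linalg.eigvalsh(A).max()  # safe step size 1/L (L = largest eigenvalue)

errors = []
for _ in range(50):
    w = w - step * (A @ w)                # gradient of f(w) is A w
    errors.append(np.linalg.norm(w))

# Linear convergence: the per-step error ratio settles to a constant < 1.
ratios = [errors[k + 1] / errors[k] for k in range(10, 40)]
```

After a few iterations the ratio stabilizes near 1 - mu/L (with mu the smallest eigenvalue of A), which is exactly the geometric decay that "linear convergence rate" refers to.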
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.