Generalization Error Bound for Hyperbolic Ordinal Embedding
- URL: http://arxiv.org/abs/2105.10475v1
- Date: Fri, 21 May 2021 17:31:08 GMT
- Title: Generalization Error Bound for Hyperbolic Ordinal Embedding
- Authors: Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Marc Cavazza,
Kenji Yamanishi
- Abstract summary: Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space.
We provide a generalization error bound of HOE for the first time, which is at most exponential with respect to the embedding space's radius.
Our comparison between the bounds of HOE and Euclidean ordinal embedding shows that HOE's generalization error is reasonable as a cost for its exponential representation ability.
- Score: 21.320308755965748
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperbolic ordinal embedding (HOE) represents entities as points in
hyperbolic space so that they agree as well as possible with given constraints
of the form "entity i is more similar to entity j than to entity k". It has
been experimentally shown that HOE can obtain representations of hierarchical
data such as a knowledge base and a citation network effectively, owing to
hyperbolic space's exponential growth property. However, its theoretical
analysis has been limited to ideal noiseless settings, and its generalization
error in compensation for hyperbolic space's exponential representation ability
has not been guaranteed. The difficulty is that existing generalization error
bound derivations for ordinal embedding based on the Gramian matrix do not work
in HOE, since hyperbolic space is not an inner-product space. In this paper,
through our novel characterization of HOE with decomposed Lorentz Gramian
matrices, we provide a generalization error bound of HOE for the first time,
which is at most exponential with respect to the embedding space's radius. Our
comparison between the bounds of HOE and Euclidean ordinal embedding shows that
HOE's generalization error is reasonable as a cost for its exponential
representation ability.
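As a minimal sketch of the ordinal constraint the abstract describes, the snippet below embeds entities in the Lorentz (hyperboloid) model of hyperbolic space and checks a triplet constraint via geodesic distance. All function names are illustrative; the paper's actual loss, Lorentz Gramian decomposition, and optimization are not reproduced here.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentz (Minkowski) inner product: -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift(v):
    # Lift a Euclidean vector v onto the hyperboloid
    # {x : <x, x>_L = -1, x0 > 0}
    return np.concatenate(([np.sqrt(1.0 + np.dot(v, v))], v))

def hyperbolic_dist(x, y):
    # Geodesic distance in the Lorentz model; the max() guards against
    # floating-point values slightly below arccosh's domain
    return np.arccosh(np.maximum(-lorentz_inner(x, y), 1.0))

def triplet_satisfied(x_i, x_j, x_k):
    # Ordinal constraint: "entity i is more similar to entity j than to entity k"
    return hyperbolic_dist(x_i, x_j) < hyperbolic_dist(x_i, x_k)

# Toy embedding of three entities
a = lift(np.array([0.1, 0.0]))
b = lift(np.array([0.2, 0.0]))
c = lift(np.array([1.5, 1.5]))
print(triplet_satisfied(a, b, c))  # a is closer to b than to c
```

HOE fits such embeddings so that as many given triplets as possible are satisfied; the paper's bound controls how the empirical triplet error generalizes.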
Related papers
- Understanding and Mitigating Hyperbolic Dimensional Collapse in Graph Contrastive Learning [70.0681902472251]
We propose a novel contrastive learning framework to learn high-quality graph embeddings in hyperbolic space.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Tight and fast generalization error bound of graph embedding in metric space [54.279425319381374]
We show that graph embedding in non-Euclidean metric space can outperform that in Euclidean space with much smaller training data than the existing bound has suggested.
Our new upper bound is significantly tighter and faster than the existing one, which can be exponential in $R$ and at best $O(\frac{1}{S})$.
arXiv Detail & Related papers (2023-05-13T17:29:18Z)
- FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph Completion [45.470475498688344]
Some important operations in hyperbolic space still lack good definitions, making existing methods unable to fully leverage the merits of hyperbolic space.
We develop a Fully and Flexible Hyperbolic Representation framework (FFHR) that is able to transfer recent Euclidean-based advances to hyperbolic space.
arXiv Detail & Related papers (2023-02-07T14:50:28Z)
- Non-Isometric Quantum Error Correction in Gravity [0.0]
We construct and study an ensemble of non-isometric error correcting codes in a toy model of an evaporating black hole in dilaton gravity.
We show that the typical such code is very likely to preserve pairwise inner products in a set $S$ of states that can be subexponentially large in the microcanonical Hilbert space dimension of the black hole.
arXiv Detail & Related papers (2022-10-24T18:00:00Z)
- Continuous percolation in a Hilbert space for a large system of qubits [58.720142291102135]
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts optimization procedure via the root alignment and origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also brings the models a better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design [8.250374560598493]
Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently.
The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space.
We present a novel fully hyperbolic neural network which uses the concept of projections (embeddings) followed by an intrinsic aggregation and a nonlinearity all within the hyperbolic space.
arXiv Detail & Related papers (2021-12-03T03:20:27Z)
- CO-SNE: Dimensionality Reduction and Visualization for Hyperbolic Data [10.618060176686916]
We propose CO-SNE, extending the Euclidean space visualization tool, t-SNE, to hyperbolic space.
Unlike Euclidean space, hyperbolic space is inhomogeneous: an equal volume can contain far more points when located far from the origin.
We apply CO-SNE to high-dimensional hyperbolic biological data as well as unsupervisedly learned hyperbolic representations.
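The inhomogeneity noted above comes from hyperbolic space's exponential volume growth. A small numeric sketch (standard closed forms for the hyperbolic plane of curvature -1; not taken from the CO-SNE paper) compares the area of a hyperbolic geodesic disk, 2*pi*(cosh(r) - 1), with the Euclidean disk of the same radius:

```python
import math

def euclidean_disk_area(r):
    # Area of a Euclidean disk of radius r
    return math.pi * r ** 2

def hyperbolic_disk_area(r):
    # Area of a geodesic disk of radius r in the hyperbolic plane
    # (curvature -1): A(r) = 2*pi*(cosh(r) - 1), growing like e^r
    return 2.0 * math.pi * (math.cosh(r) - 1.0)

for r in [1.0, 3.0, 6.0]:
    ratio = hyperbolic_disk_area(r) / euclidean_disk_area(r)
    print(f"r={r}: hyperbolic/Euclidean area ratio = {ratio:.1f}")
```

The ratio grows rapidly with r, which is why a fixed volume far from the origin can host many more embedded points than the same volume near it.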
arXiv Detail & Related papers (2021-11-30T00:21:47Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
- Quantum Geometric Confinement and Dynamical Transmission in Grushin Cylinder [68.8204255655161]
We classify the self-adjoint realisations of the Laplace-Beltrami operator minimally defined on an infinite cylinder.
We retrieve those distinguished extensions previously identified in the recent literature, namely the most confining and the most transmitting.
arXiv Detail & Related papers (2020-03-16T11:37:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.