FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph
Completion
- URL: http://arxiv.org/abs/2302.04088v1
- Date: Tue, 7 Feb 2023 14:50:28 GMT
- Title: FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph
Completion
- Authors: Wentao Shi, Junkang Wu, Xuezhi Cao, Jiawei Chen, Wenqiang Lei, Wei Wu
and Xiangnan He
- Abstract summary: Some important operations in hyperbolic space still lack good definitions, making existing methods unable to fully leverage the merits of hyperbolic space.
We develop a Fully and Flexible Hyperbolic Representation framework (FFHR) that is able to transfer recent Euclidean-based advances to hyperbolic space.
- Score: 45.470475498688344
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning hyperbolic embeddings for knowledge graph (KG) has gained increasing
attention due to its superiority in capturing hierarchies. However, some
important operations in hyperbolic space still lack good definitions, making
existing methods unable to fully leverage the merits of hyperbolic space.
Specifically, they suffer from two main limitations: 1) existing Graph
Convolutional Network (GCN) methods in hyperbolic space rely on tangent space
approximation, which would incur approximation error in representation
learning, and 2) due to the lack of a well-defined inner product operation in
hyperbolic space, existing methods can only measure the plausibility of facts
(links) with hyperbolic distance, which makes it difficult to capture complex
data patterns. In this work, we contribute: 1) a Full Poincar\'{e} Multi-relational
GCN that achieves graph information propagation in hyperbolic space without
requiring any approximation, and 2) a hyperbolic generalization of Euclidean
inner product that is beneficial to capture both hierarchical and complex
patterns. On this basis, we further develop a \textbf{F}ully and
\textbf{F}lexible \textbf{H}yperbolic \textbf{R}epresentation framework
(\textbf{FFHR}) that is able to transfer recent Euclidean-based advances to
hyperbolic space. We demonstrate it by instantiating FFHR with four
representative KGC methods. Extensive experiments on benchmark datasets
validate the superiority of our FFHRs over their Euclidean counterparts as well
as state-of-the-art hyperbolic embedding methods.
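For readers unfamiliar with the hyperbolic operations the abstract refers to, the following is a minimal plain-Python sketch of the standard Poincar\'{e}-ball primitives: M\"obius addition, geodesic distance, and the exponential/logarithmic maps at the origin that underlie tangent-space approximation. Function names are illustrative only; this is not the FFHR implementation.

```python
import math

def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mobius_add(x, y, c=1.0):
    # Mobius addition x (+)_c y on the Poincare ball of curvature -c
    xy, x2, y2 = _dot(x, y), _dot(x, x), _dot(y, y)
    den = 1 + 2 * c * xy + c * c * x2 * y2
    return [((1 + 2 * c * xy + c * y2) * a + (1 - c * x2) * b) / den
            for a, b in zip(x, y)]

def poincare_dist(x, y, c=1.0):
    # Geodesic distance: (2 / sqrt(c)) * artanh(sqrt(c) * ||(-x) (+)_c y||)
    diff = mobius_add([-a for a in x], y, c)
    return (2 / math.sqrt(c)) * math.atanh(math.sqrt(c) * math.sqrt(_dot(diff, diff)))

def exp0(v, c=1.0):
    # Exponential map at the origin: tangent vector -> point on the ball
    n = math.sqrt(_dot(v, v))
    if n == 0:
        return list(v)
    s = math.tanh(math.sqrt(c) * n) / (math.sqrt(c) * n)
    return [s * a for a in v]

def log0(x, c=1.0):
    # Logarithmic map at the origin: point on the ball -> tangent vector
    n = math.sqrt(_dot(x, x))
    if n == 0:
        return list(x)
    s = math.atanh(math.sqrt(c) * n) / (math.sqrt(c) * n)
    return [s * a for a in x]
```

A tangent-space GCN step aggregates neighbors roughly as `exp0(mean([log0(x_i) ...]))`; routing hyperbolic points through Euclidean averaging in the tangent space is exactly the approximation the Full Poincar\'{e} Multi-relational GCN is designed to avoid.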
Related papers
- Disentangled Hyperbolic Representation Learning for Heterogeneous Graphs [29.065531121422204]
We propose Dis-H$^2$GCN, a Disentangled Hyperbolic Heterogeneous Graph Convolutional Network.
We evaluate our proposed Dis-H$^2$GCN on five real-world heterogeneous graph datasets.
arXiv Detail & Related papers (2024-06-14T18:50:47Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Tight and fast generalization error bound of graph embedding in metric space [54.279425319381374]
We show that graph embedding in non-Euclidean metric space can outperform that in Euclidean space with much smaller training data than the existing bound has suggested.
Our new upper bound is significantly tighter and faster than the existing one, which can be exponential in $R$ and $O(\frac{1}{S})$ at the fastest.
arXiv Detail & Related papers (2023-05-13T17:29:18Z) - HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric
Regularization [52.369435664689995]
We introduce a Hyperbolic Regularization powered Collaborative Filtering (HRCF) method and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also brings the models a better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincar\'{e} ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincar\'{e} second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- HyLa: Hyperbolic Laplacian Features For Graph Learning [44.33054069927441]
Hyperbolic space can support embeddings of tree- and graph-structured data.
For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks.
Existing hyperbolic networks are computationally expensive and can be numerically unstable.
We propose HyLa, a completely different approach to using hyperbolic space in graph learning.
arXiv Detail & Related papers (2022-02-14T16:40:24Z)
- Enhancing Hyperbolic Graph Embeddings via Contrastive Learning [7.901082408569372]
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z)
- A Hyperbolic-to-Hyperbolic Graph Convolutional Network [46.80564170208473]
We propose a hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that works directly on the hyperbolic manifold.
The H2H-GCN achieves substantial improvements on the link prediction, node classification, and graph classification tasks.
arXiv Detail & Related papers (2021-04-14T16:09:27Z)
- Differentiating through the Fr\'echet Mean [51.32291896926807]
The Fr\'echet mean is a generalization of the Euclidean mean.
We show how to differentiate through the Fr\'echet mean for an arbitrary Riemannian manifold.
This fully integrates the Fr\'echet mean into the hyperbolic neural network pipeline.
arXiv Detail & Related papers (2020-02-29T19:49:38Z)
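For reference, the Fr\'echet mean mentioned in the last entry has a standard definition (not specific to that paper): for points $x_1, \dots, x_n$ on a manifold $\mathcal{M}$ with geodesic distance $d_{\mathcal{M}}$, it is the minimizer of the average squared distance:

```latex
\mu_{\mathrm{Fr}} = \operatorname*{arg\,min}_{m \in \mathcal{M}} \; \frac{1}{n} \sum_{i=1}^{n} d_{\mathcal{M}}(m, x_i)^2
```

When $\mathcal{M} = \mathbb{R}^d$ with the Euclidean metric, $\mu_{\mathrm{Fr}}$ reduces to the arithmetic mean; differentiating through this $\arg\min$ on curved manifolds is what the paper above enables.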
This list is automatically generated from the titles and abstracts of the papers in this site.