Unit Ball Model for Hierarchical Embeddings in Complex Hyperbolic Space
- URL: http://arxiv.org/abs/2105.03966v1
- Date: Sun, 9 May 2021 16:09:54 GMT
- Title: Unit Ball Model for Hierarchical Embeddings in Complex Hyperbolic Space
- Authors: Huiru Xiao, Caigao Jiang, Yangqiu Song, James Zhang, Junwu Xiong
- Abstract summary: Learning the representation of data with hierarchical structures in the hyperbolic space has attracted increasing attention in recent years.
We propose to learn the graph embeddings in the unit ball model of the complex hyperbolic space.
- Score: 28.349200177632852
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning the representation of data with hierarchical structures in the
hyperbolic space has attracted increasing attention in recent years. Due to its
constant negative curvature, the hyperbolic space resembles tree metrics and
naturally captures the tree-like properties of hierarchical graphs, which
enables hyperbolic embeddings to improve over traditional Euclidean models.
However, most graph data, even data with hierarchical structures, are not
trees, and they usually do not uniformly match the constant-curvature property
of the hyperbolic space. To address this limitation of hyperbolic embeddings,
we explore the complex hyperbolic space, which has variable negative curvature,
for representation learning. Specifically, we propose to learn the graph
embeddings in the unit ball model of the complex hyperbolic space. Embeddings
based on the unit ball model have a more powerful representation capacity to
capture a variety of hierarchical graph structures. Through experiments on
synthetic and real-world data, we show that our approach improves significantly
over hyperbolic embedding models.
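As an illustration of the unit ball model referenced above: points live in the open unit ball of $\mathbb{C}^n$, and a standard form of the complex hyperbolic distance (Bergman metric) is $\cosh^2\!\big(d(z,w)/2\big) = |1-\langle z,w\rangle|^2 / \big((1-\|z\|^2)(1-\|w\|^2)\big)$. The minimal NumPy sketch below computes this quantity; the function name is hypothetical and the curvature normalization may differ from the exact parameterization used in the paper.

```python
import numpy as np

def complex_hyperbolic_distance(z, w):
    """Distance between two points of the unit ball model of complex
    hyperbolic space, using the standard Bergman-metric formula.
    Illustrative sketch; the paper's curvature scaling may differ.

    z, w: complex arrays of shape (n,) with ||z|| < 1 and ||w|| < 1.
    """
    # Hermitian inner product <z, w> = sum_i z_i * conj(w_i)
    # (np.vdot conjugates its first argument)
    herm = np.vdot(w, z)
    num = np.abs(1.0 - herm) ** 2
    den = (1.0 - np.vdot(z, z).real) * (1.0 - np.vdot(w, w).real)
    # cosh^2(d/2) = |1 - <z,w>|^2 / ((1 - |z|^2)(1 - |w|^2)) >= 1
    ratio = np.clip(num / den, 1.0, None)   # guard against round-off
    return 2.0 * np.arccosh(np.sqrt(ratio))

# toy usage: two points in the unit ball of C^2
z = np.array([0.3 + 0.1j, -0.2 + 0.05j])
w = np.array([0.1 - 0.4j, 0.25 + 0.2j])
print(complex_hyperbolic_distance(z, w))
```

Distances grow without bound as points approach the boundary of the ball, which is what allows such models to accommodate tree-like branching structure.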
Related papers
- Shedding Light on Problems with Hyperbolic Graph Learning [2.3743504594834635]
Recent papers in the graph machine learning literature have introduced a number of approaches for hyperbolic representation learning.
We take a careful look at the field of hyperbolic graph representation learning as it stands today.
We find that a number of papers fail to diligently present baselines, make faulty modelling assumptions when constructing algorithms, and use misleading metrics to quantify geometry of graph datasets.
arXiv Detail & Related papers (2024-11-11T03:12:41Z) - Weighted Embeddings for Low-Dimensional Graph Representation [0.13499500088995461]
We propose embedding a graph into a weighted space, which is closely related to hyperbolic geometry but mathematically simpler.
We show that our weighted embeddings heavily outperform state-of-the-art Euclidean embeddings for heterogeneous graphs while using fewer dimensions.
arXiv Detail & Related papers (2024-10-08T13:41:03Z) - Hyperbolic Heterogeneous Graph Attention Networks [3.0165549581582454]
Most previous heterogeneous graph embedding models represent elements in a heterogeneous graph as vector representations in a low-dimensional Euclidean space.
We propose Hyperbolic Heterogeneous Graph Attention Networks (HHGAT) that learn vector representations in hyperbolic spaces with meta-path instances.
We conducted experiments on three real-world heterogeneous graph datasets, demonstrating that HHGAT outperforms state-of-the-art heterogeneous graph embedding models in node classification and clustering tasks.
arXiv Detail & Related papers (2024-04-15T04:45:49Z) - Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
arXiv Detail & Related papers (2024-04-12T17:14:58Z) - Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity, which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z) - Tight and fast generalization error bound of graph embedding in metric space [54.279425319381374]
We show that graph embedding in a non-Euclidean metric space can outperform embedding in Euclidean space with much less training data than the existing bound has suggested.
Our new upper bound is significantly tighter and faster than the existing one, which can be exponential in $R$ and $O(\frac{1}{S})$ at the fastest.
arXiv Detail & Related papers (2023-05-13T17:29:18Z) - Hyperbolic Graph Representation Learning: A Tutorial [39.25873010585029]
This tutorial aims to give an introduction to this emerging field of graph representation learning with the express purpose of being accessible to all audiences.
We first give a brief introduction to graph representation learning as well as some preliminary Riemannian and hyperbolic geometry.
We then comprehensively revisit the technical details of the current hyperbolic graph neural networks by unifying them into a general framework.
arXiv Detail & Related papers (2022-11-08T07:15:29Z) - Geometry Interaction Knowledge Graph Embeddings [153.69745042757066]
We propose Geometry Interaction knowledge graph Embeddings (GIE), which learns spatial structures interactively between the Euclidean, hyperbolic and hyperspherical spaces.
Our proposed GIE can capture a richer set of relational information, model key inference patterns, and enable expressive semantic matching across entities.
arXiv Detail & Related papers (2022-06-24T08:33:43Z) - HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z) - Enhancing Hyperbolic Graph Embeddings via Contrastive Learning [7.901082408569372]
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z) - Hyperbolic Graph Embedding with Enhanced Semi-Implicit Variational Inference [48.63194907060615]
We build off of semi-implicit graph variational auto-encoders to capture higher-order statistics in a low-dimensional graph latent representation.
We incorporate hyperbolic geometry in the latent space through a Poincare embedding to efficiently represent graphs exhibiting hierarchical structure.
arXiv Detail & Related papers (2020-10-31T05:48:34Z)
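The Poincaré embedding used in the last entry above is often realized by pushing a Euclidean latent vector into the Poincaré ball with the exponential map at the origin, $\exp_0^c(v) = \tanh(\sqrt{c}\,\|v\|)\, v/(\sqrt{c}\,\|v\|)$ (Ganea et al., 2018). The sketch below is an illustrative assumption of that generic construction, not necessarily the exact mechanism of the cited paper.

```python
import numpy as np

def poincare_expmap0(v, c=1.0, eps=1e-9):
    """Exponential map at the origin of the Poincare ball with curvature -c.

    Maps a Euclidean (tangent-space) vector v onto the open ball of radius
    1/sqrt(c); a common way to give a Euclidean latent code a hyperbolic
    interpretation (cf. Ganea et al., 2018). Illustrative only.
    """
    norm = np.linalg.norm(v)
    if norm < eps:
        return np.zeros_like(v)
    sqrt_c = np.sqrt(c)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

# toy usage: push a 3-d Euclidean latent vector into the Poincare ball
mu = np.array([0.8, -1.5, 0.3])
print(poincare_expmap0(mu))   # lies strictly inside the unit ball
```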
This list is automatically generated from the titles and abstracts of the papers on this site.