Enhancing Hyperbolic Graph Embeddings via Contrastive Learning
- URL: http://arxiv.org/abs/2201.08554v1
- Date: Fri, 21 Jan 2022 06:10:05 GMT
- Title: Enhancing Hyperbolic Graph Embeddings via Contrastive Learning
- Authors: Jiahong Liu, Menglin Yang, Min Zhou, Shanshan Feng, Philippe
Fournier-Viger
- Abstract summary: We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
- Score: 7.901082408569372
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, hyperbolic space has risen as a promising alternative for
semi-supervised graph representation learning. Many efforts have been made to
design hyperbolic versions of neural network operations. However, the inspiring
geometric properties of this unique geometry have not been fully explored yet.
The potency of graph models powered by the hyperbolic space is still largely
underestimated. Besides, the rich information carried by abundant unlabelled
samples is also not well utilized. Inspired by the recently active and emerging
self-supervised learning, in this study, we attempt to enhance the
representation power of hyperbolic graph models by drawing upon the advantages
of contrastive learning. More specifically, we put forward a novel Hyperbolic
Graph Contrastive Learning (HGCL) framework which learns node representations
through multiple hyperbolic spaces to implicitly capture the hierarchical
structure shared between different views. Then, we design a hyperbolic position
consistency (HPC) constraint based on hyperbolic distance and the homophily
assumption to make contrastive learning fit into hyperbolic space. Experimental
results on multiple real-world datasets demonstrate the superiority of the
proposed HGCL as it consistently outperforms competing methods by considerable
margins for the node classification task.
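The abstract names two ingredients, distances measured in hyperbolic space and a contrastive objective over two graph views, without giving formulas here. The sketch below is therefore only an illustration under assumed choices: a Poincaré-ball distance and an InfoNCE-style loss, where the curvature `c`, the `temperature`, and the view embeddings `z1`/`z2` are hypothetical placeholders rather than the paper's actual HPC formulation.

```python
import torch

def poincare_distance(x, y, c=1.0, eps=1e-6):
    """Geodesic distance on the Poincare ball of curvature -c (illustrative)."""
    diff2 = ((x - y) ** 2).sum(dim=-1)
    x2 = (x ** 2).sum(dim=-1)
    y2 = (y ** 2).sum(dim=-1)
    denom = ((1 - c * x2) * (1 - c * y2)).clamp_min(eps)
    arg = 1 + 2 * c * diff2 / denom
    return torch.acosh(arg.clamp_min(1 + eps)) / c ** 0.5

def hyperbolic_contrastive_loss(z1, z2, c=1.0, temperature=0.5):
    """InfoNCE-style loss whose similarity is the negative hyperbolic distance
    between node embeddings from two views (a sketch, not the paper's exact
    hyperbolic position consistency constraint)."""
    n = z1.size(0)
    # Pairwise distances between every node in view 1 and every node in view 2.
    d = poincare_distance(z1.unsqueeze(1), z2.unsqueeze(0), c=c)  # shape (n, n)
    logits = -d / temperature
    targets = torch.arange(n)  # positive pair: the same node in both views
    return torch.nn.functional.cross_entropy(logits, targets)

# Toy usage with random embeddings that lie well inside the unit ball.
z1 = torch.randn(8, 16) * 0.1
z2 = torch.randn(8, 16) * 0.1
print(hyperbolic_contrastive_loss(z1, z2).item())
```

In the actual framework, `z1` and `z2` would come from hyperbolic graph encoders applied to two augmented views of the same graph; the random tensors above only keep the example self-contained.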
Related papers
- Weighted Embeddings for Low-Dimensional Graph Representation [0.13499500088995461]
We propose embedding a graph into a weighted space, which is closely related to hyperbolic geometry but mathematically simpler.
We show that our weighted embeddings heavily outperform state-of-the-art Euclidean embeddings for heterogeneous graphs while using fewer dimensions.
arXiv Detail & Related papers (2024-10-08T13:41:03Z)
- Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
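The summary describes the HyperDGA score as a count of hyperbolic Delaunay-graph edges joining points from the two sets. The exact Delaunay construction is not reproduced here, so the sketch below substitutes a hyperbolic k-nearest-neighbour graph as a simplified stand-in; the helper `poincare_dist`, the choice of `k`, and the toy point clouds are illustrative assumptions.

```python
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    """Poincare-ball distance between two points inside the unit ball."""
    diff2 = np.sum((u - v) ** 2)
    denom = max((1 - np.sum(u ** 2)) * (1 - np.sum(v ** 2)), eps)
    return np.arccosh(1 + 2 * diff2 / denom)

def cross_set_edge_score(A, B, k=3):
    """Count k-NN-graph edges (a stand-in for the hyperbolic Delaunay graph)
    that connect a point of set A to a point of set B."""
    pts = np.vstack([A, B])
    labels = np.array([0] * len(A) + [1] * len(B))
    n = len(pts)
    dists = np.array([[poincare_dist(pts[i], pts[j]) for j in range(n)]
                      for i in range(n)])
    cross = 0
    for i in range(n):
        neighbours = np.argsort(dists[i])[1:k + 1]  # skip the point itself
        cross += sum(labels[j] != labels[i] for j in neighbours)
    return cross

# Toy usage: two small point clouds inside the unit ball.
rng = np.random.default_rng(0)
A = rng.normal(scale=0.1, size=(10, 2))
B = rng.normal(scale=0.1, size=(10, 2)) + 0.3
print(cross_set_edge_score(A, B))
```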
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Hyperbolic Graph Representation Learning: A Tutorial [39.25873010585029]
This tutorial aims to give an introduction to this emerging field of graph representation learning with the express purpose of being accessible to all audiences.
We first give a brief introduction to graph representation learning as well as some preliminary Riemannian and hyperbolic geometry.
We then comprehensively revisit the technical details of the current hyperbolic graph neural networks by unifying them into a general framework.
arXiv Detail & Related papers (2022-11-08T07:15:29Z)
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from both Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to capture complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still bounded and limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- HyLa: Hyperbolic Laplacian Features For Graph Learning [44.33054069927441]
Hyperbolic space can support embeddings of tree- and graph-structured data.
For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks.
Existing hyperbolic networks are computationally expensive and can be numerically unstable.
We propose HyLa, a completely different approach to using hyperbolic space in graph learning.
arXiv Detail & Related papers (2022-02-14T16:40:24Z)
- Spatial-spectral Hyperspectral Image Classification via Multiple Random Anchor Graphs Ensemble Learning [88.60285937702304]
This paper proposes a novel spatial-spectral HSI classification method via multiple random anchor graphs ensemble learning (RAGE).
Firstly, the local binary pattern is adopted to extract more descriptive features on each selected band, which preserves local structures and subtle changes of a region.
Secondly, adaptive neighbor assignment is introduced in the construction of the anchor graph to reduce the computational complexity.
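The RAGE summary mentions local binary patterns (LBP) for per-band feature extraction but not the specific variant, so the following is a generic 8-neighbour LBP sketch on a single 2-D band; the function name and the random toy band are illustrative only.

```python
import numpy as np

def lbp_8neighbour(band):
    """Standard 8-neighbour local binary pattern for one 2-D band.

    Each interior pixel is compared with its 8 neighbours; neighbours that are
    >= the centre pixel contribute a bit to an 8-bit code (a generic LBP, not
    necessarily the exact variant used by RAGE)."""
    h, w = band.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Clockwise neighbour offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = band[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = band[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return codes

# Toy usage on a random 8x8 "band".
band = np.random.default_rng(0).random((8, 8))
print(lbp_8neighbour(band))
```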
arXiv Detail & Related papers (2021-03-25T09:31:41Z)