Hyperbolic Graph Representation Learning: A Tutorial
- URL: http://arxiv.org/abs/2211.04050v1
- Date: Tue, 8 Nov 2022 07:15:29 GMT
- Title: Hyperbolic Graph Representation Learning: A Tutorial
- Authors: Min Zhou, Menglin Yang, Lujia Pan, Irwin King
- Abstract summary: This tutorial aims to give an introduction to this emerging field of graph representation learning with the express purpose of being accessible to all audiences.
We first give a brief introduction to graph representation learning as well as some preliminary Riemannian and hyperbolic geometry.
We then comprehensively revisit the technical details of the current hyperbolic graph neural networks by unifying them into a general framework.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graph-structured data are widespread in real-world applications, such as
social networks, recommender systems, knowledge graphs, chemical molecules etc.
Despite the success of Euclidean space for graph-related learning tasks, its
ability to model complex patterns is essentially constrained by its
polynomially growing capacity. Recently, hyperbolic spaces have emerged as a
promising alternative for processing graph data with tree-like structures or
power-law distributions, owing to their exponential growth property. Unlike
Euclidean space, which expands polynomially, hyperbolic space grows
exponentially, giving it a natural advantage in representing tree-like or
scale-free graphs with hierarchical organization.
In this tutorial, we aim to give an introduction to this emerging field of
graph representation learning with the express purpose of being accessible to
all audiences. We first give a brief introduction to graph representation
learning as well as some preliminary Riemannian and hyperbolic geometry. We
then comprehensively revisit the hyperbolic embedding techniques, including
hyperbolic shallow models and hyperbolic neural networks. In addition, we
introduce the technical details of the current hyperbolic graph neural networks
by unifying them into a general framework and summarizing the variants of each
component. Moreover, we further introduce a series of related applications in a
variety of fields. In the last part, we discuss several advanced topics about
hyperbolic geometry for graph representation learning, which potentially serve
as guidelines for further flourishing the non-Euclidean graph learning
community.
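The exponential-growth claim in the abstract can be illustrated numerically: in the hyperbolic plane of constant curvature -1, the circumference of a circle of radius r is 2π·sinh(r), which quickly dwarfs the Euclidean 2πr. A minimal sketch of this comparison (not from the tutorial itself; the function names are illustrative):

```python
import math

# Circumference of a circle of radius r:
#   Euclidean plane:                  2 * pi * r        (polynomial growth)
#   Hyperbolic plane, curvature -1:   2 * pi * sinh(r)  (exponential growth)

def euclidean_circumference(r: float) -> float:
    return 2 * math.pi * r

def hyperbolic_circumference(r: float) -> float:
    return 2 * math.pi * math.sinh(r)

for r in (1.0, 2.0, 5.0, 10.0):
    ratio = hyperbolic_circumference(r) / euclidean_circumference(r)
    print(f"r={r:4.1f}  hyperbolic/Euclidean circumference ratio = {ratio:10.1f}")
```

Because available "room" grows exponentially with radius, trees — whose node count grows exponentially with depth — can be embedded in hyperbolic space with far lower distortion than in Euclidean space of the same dimension.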
Related papers
- Weighted Embeddings for Low-Dimensional Graph Representation
We propose embedding a graph into a weighted space, which is closely related to hyperbolic geometry but mathematically simpler.
We show that our weighted embeddings substantially outperform state-of-the-art Euclidean embeddings for heterogeneous graphs while using fewer dimensions.
arXiv Detail & Related papers (2024-10-08T13:41:03Z)
- Everything is Connected: Graph Neural Networks
The main aim of this short survey is to enable the reader to assimilate the key concepts in the area, and to position graph representation learning in a proper context with related fields.
arXiv Detail & Related papers (2023-01-19T18:09:43Z)
- State of the Art and Potentialities of Graph-level Learning
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning remains limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Enhancing Hyperbolic Graph Embeddings via Contrastive Learning
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z)
- Unit Ball Model for Hierarchical Embeddings in Complex Hyperbolic Space
Learning representations of data with hierarchical structure in hyperbolic space has attracted increasing attention in recent years.
We propose to learn the graph embeddings in the unit ball model of the complex hyperbolic space.
arXiv Detail & Related papers (2021-05-09T16:09:54Z)
- GraphOpt: Learning Optimization Models of Graph Formation
We propose an end-to-end framework that learns an implicit model of graph structure formation and discovers an underlying optimization mechanism.
The learned objective can serve as an explanation for the observed graph properties, thereby lending itself to transfer across different graphs within a domain.
GraphOpt poses link formation in graphs as a sequential decision-making process and solves it with a maximum entropy inverse reinforcement learning algorithm.
arXiv Detail & Related papers (2020-07-07T16:51:39Z)
- Geometrically Principled Connections in Graph Neural Networks
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
- Deep Learning for Learning Graph Representations
Mining graph data has become a popular research topic in computer science.
The huge amount of network data has posed great challenges for efficient analysis.
This motivates the advent of graph representation learning, which maps a graph into a low-dimensional vector space.
arXiv Detail & Related papers (2020-01-02T02:13:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.