Node-Specific Space Selection via Localized Geometric Hyperbolicity in
Graph Neural Networks
- URL: http://arxiv.org/abs/2303.01724v1
- Date: Fri, 3 Mar 2023 06:04:42 GMT
- Title: Node-Specific Space Selection via Localized Geometric Hyperbolicity in
Graph Neural Networks
- Authors: See Hian Lee, Feng Ji and Wee Peng Tay
- Abstract summary: Many graph neural networks have been developed to learn graph representations in either Euclidean or hyperbolic space.
In this paper, we analyze two notions of local hyperbolicity, describing the underlying local geometry.
We show that our model Joint Space Graph Neural Network (JSGNN) can leverage both Euclidean and hyperbolic spaces during learning.
- Score: 38.7842803074593
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Many graph neural networks have been developed to learn graph representations
in either Euclidean or hyperbolic space, with all nodes' representations
embedded in a single space. However, a graph can have hyperbolic and Euclidean
geometries at different regions of the graph. Thus, it is sub-optimal to
indifferently embed an entire graph into a single space. In this paper, we
explore and analyze two notions of local hyperbolicity, describing the
underlying local geometry: geometric (Gromov) and model-based, to determine the
preferred space of embedding for each node. The two hyperbolicities'
distributions are aligned using the Wasserstein metric such that the calculated
geometric hyperbolicity guides the choice of the learned model hyperbolicity.
As such, our model Joint Space Graph Neural Network (JSGNN) can leverage both
Euclidean and hyperbolic spaces during learning by allowing node-specific
geometry space selection. We evaluate our model on both node classification and
link prediction tasks and observe promising performance compared to baseline
models.
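The abstract leans on two quantities: the Gromov (four-point) hyperbolicity of a graph metric, and a Wasserstein distance used to align two hyperbolicity distributions. The following is a minimal Python sketch of both ideas under simplifying assumptions (unweighted hop distance, equal-size 1-D samples); it illustrates the concepts only and is not the paper's implementation.

```python
# Sketch: Gromov four-point delta of a graph metric, plus a 1-D Wasserstein-1
# distance for comparing two hyperbolicity distributions. Illustrative only.
from collections import deque
from itertools import combinations

def bfs_distances(adj, src):
    """Hop distances from src in an unweighted graph given as {node: [nbrs]}."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def gromov_delta(adj):
    """Four-point delta: half the gap between the two largest of the three
    pairwise distance sums, maximized over all quadruples of nodes.
    Trees are 0-hyperbolic; cycles are not."""
    nodes = list(adj)
    d = {u: bfs_distances(adj, u) for u in nodes}
    delta = 0.0
    for x, y, u, v in combinations(nodes, 4):
        sums = sorted((d[x][y] + d[u][v], d[x][u] + d[y][v], d[x][v] + d[y][u]))
        delta = max(delta, (sums[2] - sums[1]) / 2)
    return delta

def wasserstein_1d(a, b):
    """W1 between two equal-size 1-D samples: mean gap of sorted values."""
    return sum(abs(x - y) for x, y in zip(sorted(a), sorted(b))) / len(a)

# A path graph (a tree) is 0-hyperbolic; a 6-cycle has delta = 1.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(gromov_delta(path), gromov_delta(cycle))  # 0.0 1.0
```

In the same spirit as the paper's alignment step, `wasserstein_1d` could compare a vector of per-node geometric hyperbolicities against a vector of learned model hyperbolicities; the quartic scan over quadruples is exact but only practical for small neighborhoods.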
Related papers
- Weighted Embeddings for Low-Dimensional Graph Representation [0.13499500088995461]
We propose embedding a graph into a weighted space, which is closely related to hyperbolic geometry but mathematically simpler.
We show that our weighted embeddings heavily outperform state-of-the-art Euclidean embeddings for heterogeneous graphs while using fewer dimensions.
arXiv Detail & Related papers (2024-10-08T13:41:03Z)
- Modeling Graphs Beyond Hyperbolic: Graph Neural Networks in Symmetric Positive Definite Matrices [8.805129821507046]
Real-world graph data is characterized by multiple types of geometric and topological features.
We construct graph neural networks that can robustly handle complex graphs.
arXiv Detail & Related papers (2023-06-24T21:50:53Z)
- Hyperbolic Graph Representation Learning: A Tutorial [39.25873010585029]
This tutorial aims to give an introduction to this emerging field of graph representation learning with the express purpose of being accessible to all audiences.
We first give a brief introduction to graph representation learning as well as some preliminary Riemannian and hyperbolic geometry.
We then comprehensively revisit the technical details of the current hyperbolic graph neural networks by unifying them into a general framework.
arXiv Detail & Related papers (2022-11-08T07:15:29Z)
- Unveiling the Sampling Density in Non-Uniform Geometric Graphs [69.93864101024639]
We consider graphs as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
In a social network, communities can be modeled as densely sampled areas, and hubs as nodes with a larger neighborhood radius.
We develop methods to estimate the unknown sampling density in a self-supervised fashion.
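The construction described above is concrete enough to sketch: sample points from an underlying metric space (here a unit square, an assumption for illustration) and connect any pair closer than a neighborhood radius. Function names are illustrative, not from the paper.

```python
# Hedged sketch of a random geometric graph: points sampled uniformly from the
# unit square, edges between pairs closer than `radius`. Illustrative only.
import math
import random

def geometric_graph(n, radius, seed=0):
    rng = random.Random(seed)
    points = [(rng.random(), rng.random()) for _ in range(n)]
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if math.dist(points[i], points[j]) < radius]
    return points, edges

# Node degree is a crude proxy for local sampling density: densely sampled
# regions yield high-degree nodes, and enlarging a node's radius makes a hub.
points, edges = geometric_graph(50, 0.2)
degree = [0] * 50
for i, j in edges:
    degree[i] += 1
    degree[j] += 1
```

Estimating the unknown sampling density, as the paper proposes, amounts to inverting this degree/density relationship without access to the latent coordinates.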
arXiv Detail & Related papers (2022-10-15T08:01:08Z)
- A Unification Framework for Euclidean and Hyperbolic Graph Neural Networks [8.080621697426997]
Hyperbolic neural networks can effectively capture the inherent hierarchy of graph datasets.
They entangle multiple incongruent (gyro-)vector spaces within a layer, which makes them limited in terms of generalization and scalability.
We propose the Poincare disk model as our search space, and apply all approximations on the disk.
We demonstrate that our model not only leverages the power of Euclidean networks such as interpretability and efficient execution of various model components, but also outperforms both Euclidean and hyperbolic counterparts on various benchmarks.
arXiv Detail & Related papers (2022-06-09T05:33:02Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still bounded and limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Geometric Graph Representation Learning via Maximizing Rate Reduction [73.6044873825311]
Learning node representations benefits various downstream tasks in graph analysis such as community detection and node classification.
We propose Geometric Graph Representation Learning (G2R) to learn node representations in an unsupervised manner.
G2R maps nodes in distinct groups into different subspaces, while each subspace is compact and different subspaces are dispersed.
arXiv Detail & Related papers (2022-02-13T07:46:24Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate a significant and consistent performance improvement in model quality with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Graph Geometry Interaction Learning [41.10468385822182]
We develop a novel Geometry Interaction Learning (GIL) method for graphs, a well-suited and efficient alternative for learning abundant geometric properties in graphs.
Our method endows each node the freedom to determine the importance of each geometry space via a flexible dual feature interaction learning and probability assembling mechanism.
Promising experimental results are presented for five benchmark datasets on node classification and link prediction tasks.
arXiv Detail & Related papers (2020-10-23T02:40:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.