A Hyperbolic-to-Hyperbolic Graph Convolutional Network
- URL: http://arxiv.org/abs/2104.06942v1
- Date: Wed, 14 Apr 2021 16:09:27 GMT
- Title: A Hyperbolic-to-Hyperbolic Graph Convolutional Network
- Authors: Jindou Dai, Yuwei Wu, Zhi Gao, and Yunde Jia
- Abstract summary: We propose a hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that directly works on hyperbolic manifolds.
The H2H-GCN achieves substantial improvements on the link prediction, node classification, and graph classification tasks.
- Score: 46.80564170208473
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hyperbolic graph convolutional networks (GCNs) demonstrate a powerful
representation ability for modeling graphs with hierarchical structure. Existing
hyperbolic GCNs resort to tangent spaces to realize graph convolution on
hyperbolic manifolds, which is suboptimal because a tangent space is only a local
approximation of the manifold. In this paper, we propose a
hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that directly
works on hyperbolic manifolds. Specifically, we develop a manifold-preserving
graph convolution that consists of a hyperbolic feature transformation and a
hyperbolic neighborhood aggregation. The hyperbolic feature transformation
acts as a linear transformation on hyperbolic manifolds. It ensures that the
transformed node representations still lie on the hyperbolic manifold by
imposing the orthogonal constraint on the transformation sub-matrix. The
hyperbolic neighborhood aggregation updates each node representation via the
Einstein midpoint. The H2H-GCN avoids the distortion caused by tangent space
approximations and keeps the global hyperbolic structure. Extensive experiments
show that the H2H-GCN achieves substantial improvements on the link prediction,
node classification, and graph classification tasks.
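A minimal sketch of the two operations described above (illustrative only, not the authors' released code), written in Python/PyTorch and assuming the Lorentz (hyperboloid) model with curvature -1. The feature transformation leaves the time-like coordinate fixed and applies a sub-matrix W to the spatial coordinates; if W is orthogonal, the Lorentz constraint -x_0^2 + ||x_spatial||^2 = -1 is preserved, so the output stays on the manifold. The aggregation maps neighbors to the Klein model, takes the gamma-weighted average (the Einstein midpoint), and maps back. The function names lorentz_linear and einstein_midpoint are hypothetical.

import torch

def lorentz_linear(x, W):
    # x: (n, d+1) points on the hyperboloid (-x_0^2 + ||x_spatial||^2 = -1, x_0 > 0).
    # W: (d, d) transformation sub-matrix; orthogonality preserves the spatial norm,
    # so the transformed points remain on the hyperboloid.
    spatial = x[:, 1:] @ W.T           # transform the spatial coordinates
    time = x[:, :1]                    # time-like coordinate is unchanged
    return torch.cat([time, spatial], dim=-1)

def einstein_midpoint(neighbors):
    # neighbors: (k, d+1) hyperboloid points belonging to one node's neighborhood.
    klein = neighbors[:, 1:] / neighbors[:, :1]                          # hyperboloid -> Klein ball
    gamma = 1.0 / torch.sqrt(1.0 - (klein ** 2).sum(-1, keepdim=True))   # Lorentz factors
    mid = (gamma * klein).sum(0) / gamma.sum(0)                          # Einstein midpoint in the Klein model
    denom = torch.sqrt(1.0 - (mid ** 2).sum())
    return torch.cat([torch.ones(1), mid]) / denom                       # Klein -> hyperboloid

# Toy usage: two neighbors in H^2 and a 2x2 rotation (orthogonal) sub-matrix.
theta = torch.tensor(0.3)
c, s = torch.cos(theta), torch.sin(theta)
W = torch.stack([torch.stack([c, -s]), torch.stack([s, c])])
spatial = torch.tensor([[0.1, 0.2], [0.3, -0.1]])
time = torch.sqrt(1.0 + (spatial ** 2).sum(-1, keepdim=True))
x = torch.cat([time, spatial], dim=-1)            # lift onto the hyperboloid
agg = einstein_midpoint(lorentz_linear(x, W))     # transformed, then aggregated

Because every step stays on the manifold, no logarithmic or exponential map to a tangent space is needed, which is the point of the hyperbolic-to-hyperbolic design.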
Related papers
- Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
- Hyperbolic Convolution via Kernel Point Aggregation [4.061135251278187]
We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space.
We show that neural networks with HKConv layers advance state-of-the-art in various tasks.
arXiv Detail & Related papers (2023-06-15T05:15:13Z)
- FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph Completion [45.470475498688344]
Some important operations in hyperbolic space still lack good definitions, making existing methods unable to fully leverage the merits of hyperbolic space.
We develop a Fully and Flexible Hyperbolic Representation framework (FFHR) that is able to transfer recent Euclidean-based advances to hyperbolic space.
arXiv Detail & Related papers (2023-02-07T14:50:28Z)
- HyLa: Hyperbolic Laplacian Features For Graph Learning [44.33054069927441]
Hyperbolic space can support embeddings of tree- and graph-structured data.
For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks.
Existing hyperbolic networks are computationally expensive and can be numerically unstable.
We propose HyLa, a completely different approach to using hyperbolic space in graph learning.
arXiv Detail & Related papers (2022-02-14T16:40:24Z)
- Enhancing Hyperbolic Graph Embeddings via Contrastive Learning [7.901082408569372]
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z)
- Fully Hyperbolic Neural Networks [63.22521652077353]
We propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model.
We show that our method has better performance for building both shallow and deep networks.
arXiv Detail & Related papers (2021-05-31T03:36:49Z)
- Lorentzian Graph Convolutional Networks [47.41609636856708]
We propose a novel hyperbolic graph convolutional network named Lorentzian graph convolutional network (LGCN).
LGCN rigorously guarantees the learned node features follow the hyperbolic geometry.
Experiments on six datasets show that LGCN performs better than the state-of-the-art methods.
arXiv Detail & Related papers (2021-04-15T14:14:25Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
For the first time, we learn dynamic graph representations in hyperbolic space, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)