Lorentzian Graph Convolutional Networks
- URL: http://arxiv.org/abs/2104.07477v1
- Date: Thu, 15 Apr 2021 14:14:25 GMT
- Title: Lorentzian Graph Convolutional Networks
- Authors: Yiding Zhang, Xiao Wang, Chuan Shi, Nian Liu, Guojie Song
- Abstract summary: We propose a novel hyperbolic graph convolutional network named Lorentzian graph convolutional network (LGCN).
LGCN rigorously guarantees the learned node features follow the hyperbolic geometry.
Experiments on six datasets show that LGCN performs better than the state-of-the-art methods.
- Score: 47.41609636856708
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph convolutional networks (GCNs) have received considerable research
attention recently. Most GCNs learn node representations in Euclidean
geometry, which can introduce high distortion when embedding graphs
with scale-free or hierarchical structure. Recently, some GCNs have been
proposed to address this problem in non-Euclidean geometry, e.g., hyperbolic geometry.
Although hyperbolic GCNs achieve promising performance, existing hyperbolic
graph operations actually cannot rigorously follow the hyperbolic geometry,
which may limit the ability of hyperbolic geometry and thus hurt the
performance of hyperbolic GCNs. In this paper, we propose a novel hyperbolic
GCN named Lorentzian graph convolutional network (LGCN), which rigorously
guarantees the learned node features follow the hyperbolic geometry.
Specifically, we rebuild the graph operations of hyperbolic GCNs with their
Lorentzian versions, e.g., the feature transformation and non-linear activation.
Also, an elegant neighborhood aggregation method is designed based on the
centroid of Lorentzian distance. Moreover, we prove that some of the proposed
graph operations are equivalent across different models of hyperbolic geometry,
which fundamentally indicates their correctness. Experiments on six datasets show
that LGCN performs better than the state-of-the-art methods. LGCN has lower
distortion when learning representations of tree-like graphs compared with
existing hyperbolic GCNs. We also find that the performance of some hyperbolic
GCNs can be improved by simply replacing the graph operations with those we
defined in this paper.
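For intuition, the centroid-based neighborhood aggregation the abstract describes can be sketched in NumPy. This is a minimal illustration, not the authors' code: it assumes the Lorentz model with curvature -1 and the closed-form weighted Lorentzian centroid, with an exponential map at the origin to place Euclidean features on the hyperboloid.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def exp_map_origin(v):
    # Map tangent vectors v in R^n onto the hyperboloid (curvature -1);
    # the time coordinate is chosen so that <x, x>_L = -1.
    norm = np.clip(np.linalg.norm(v, axis=-1, keepdims=True), 1e-9, None)
    space = np.sinh(norm) * v / norm
    time = np.cosh(norm)
    return np.concatenate([time, space], axis=-1)

def lorentzian_centroid(points, weights):
    # Closed-form minimizer of the weighted squared Lorentzian distance:
    # take the weighted sum, then renormalize it back onto the manifold.
    s = np.sum(weights[:, None] * points, axis=0)
    return s / np.sqrt(-lorentz_inner(s, s))

# Aggregate a node's three neighbors on the hyperboloid.
tangent_feats = np.array([[0.1, -0.2], [0.3, 0.05], [-0.4, 0.2]])
pts = exp_map_origin(tangent_feats)
w = np.array([0.5, 0.3, 0.2])  # e.g. attention or degree-based weights
mu = lorentzian_centroid(pts, w)
# The aggregated feature stays exactly on the hyperboloid:
print(round(float(lorentz_inner(mu, mu)), 6))  # -1.0
```

The renormalization step is what keeps the aggregated feature rigorously on the manifold, which is the property the abstract emphasizes.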
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Hyperbolic Heterogeneous Graph Attention Networks [3.0165549581582454]
Most previous heterogeneous graph embedding models represent elements in a heterogeneous graph as vector representations in a low-dimensional Euclidean space.
We propose Hyperbolic Heterogeneous Graph Attention Networks (HHGAT) that learn vector representations in hyperbolic spaces with meta-path instances.
We conducted experiments on three real-world heterogeneous graph datasets, demonstrating that HHGAT outperforms state-of-the-art heterogeneous graph embedding models in node classification and clustering tasks.
arXiv Detail & Related papers (2024-04-15T04:45:49Z)
- L^2GC: Lorentzian Linear Graph Convolutional Networks for Node Classification [12.69417276887153]
We propose a novel framework for Lorentzian linear GCN.
We map the learned features of graph nodes into hyperbolic space.
We then perform a Lorentzian linear feature transformation to capture the underlying tree-like structure of data.
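One plausible sketch of such a Lorentzian linear feature transformation (a hypothetical construction assuming curvature -1, not necessarily the paper's exact layer): apply a Euclidean linear map to the space-like coordinates and recompute the time coordinate so the output stays on the hyperboloid.

```python
import numpy as np

def lorentz_linear(x, W):
    # Hypothetical sketch: transform the space-like coordinates with W,
    # then recompute the time coordinate so that <y, y>_L = -1 holds.
    space = x[..., 1:] @ W.T
    time = np.sqrt(1.0 + np.sum(space ** 2, axis=-1, keepdims=True))
    return np.concatenate([time, space], axis=-1)

# A point on the 2D hyperboloid (curvature -1) and a toy weight matrix.
x = np.array([np.sqrt(1.0 + 0.3 ** 2 + 0.4 ** 2), 0.3, 0.4])
W = np.array([[0.5, -0.2], [0.1, 0.9]])
y = lorentz_linear(x, W)
# y stays on the manifold: -y0^2 + y1^2 + y2^2 = -1 (up to float error)
print(round(float(-y[0] ** 2 + y[1] ** 2 + y[2] ** 2), 6))  # -1.0
```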
arXiv Detail & Related papers (2024-03-10T02:16:13Z)
- HyLa: Hyperbolic Laplacian Features For Graph Learning [44.33054069927441]
Hyperbolic space can support embeddings of tree- and graph-structured data.
For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks.
Existing hyperbolic networks are computationally expensive and can be numerically unstable.
We propose HyLa, a completely different approach to using hyperbolic space in graph learning.
arXiv Detail & Related papers (2022-02-14T16:40:24Z)
- Simplified Graph Convolution with Heterophily [25.7577503312319]
We show that Simple Graph Convolution (SGC) is ineffective for heterophilous (i.e., non-homophilous) graphs.
We propose Adaptive Simple Graph Convolution (ASGC), which we show can adapt to both homophilous and heterophilous graph structure.
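For context, the fixed low-pass filter of plain SGC that this paper argues fails under heterophily can be sketched as follows (a minimal NumPy version of standard SGC preprocessing, not the authors' code):

```python
import numpy as np

def sgc_features(A, X, k=2):
    # Plain SGC: add self-loops, symmetrically normalize the adjacency,
    # and apply the k-th power of the propagation matrix to the features.
    # ASGC replaces this fixed filter with adaptive, learned coefficients.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    H = X
    for _ in range(k):
        H = S @ H
    return H

# 4-node path graph; smoothing makes neighboring rows more similar.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = sgc_features(A, np.eye(4), k=2)
```

Because the filter is a fixed k-step smoothing, it implicitly assumes neighbors share labels (homophily), which is precisely the assumption ASGC relaxes.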
arXiv Detail & Related papers (2022-02-08T20:52:08Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- A Hyperbolic-to-Hyperbolic Graph Convolutional Network [46.80564170208473]
We propose a hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that works directly on the hyperbolic manifold.
The H2H-GCN achieves substantial improvements on the link prediction, node classification, and graph classification tasks.
arXiv Detail & Related papers (2021-04-14T16:09:27Z)
- Dissecting the Diffusion Process in Linear Graph Convolutional Networks [71.30132908130581]
Graph Convolutional Networks (GCNs) have attracted increasing attention in recent years.
Recent works show that a linear GCN can achieve comparable performance to the original non-linear GCN.
We propose Decoupled Graph Convolution (DGC) that decouples the terminal time and the feature propagation steps.
arXiv Detail & Related papers (2021-02-22T02:45:59Z)
- Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCN) are widely used in learning graph representations due to their effectiveness and efficiency.
They suffer from the notorious over-smoothing problem, in which the learned representations converge to similar vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet) which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
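The gating idea can be sketched as follows (a hypothetical illustration, not GHNet's actual layer; `Wg` stands in for an assumed learned parameter): a per-node, per-dimension gate interpolates between the ungated input (preserving heterogeneity) and the graph-convolved output (promoting homogeneity).

```python
import numpy as np

def gated_layer(H_in, H_conv, Wg):
    # Hypothetical gating sketch: a sigmoid gate computed from the layer
    # input blends the convolved output with the input itself, limiting
    # how fast representations can collapse toward one another.
    g = 1.0 / (1.0 + np.exp(-(H_in @ Wg)))   # gate values in (0, 1)
    return g * H_conv + (1.0 - g) * H_in

rng = np.random.default_rng(0)
H_in = rng.normal(size=(5, 3))
H_conv = rng.normal(size=(5, 3))
Wg = rng.normal(size=(3, 3))
H_out = gated_layer(H_in, H_conv, Wg)
```

Because the output is an element-wise convex combination of `H_in` and `H_conv`, each entry is bounded by the corresponding inputs, so the layer can never over-smooth more than the convolution alone.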
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.