ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network
- URL: http://arxiv.org/abs/2110.07888v1
- Date: Fri, 15 Oct 2021 07:18:57 GMT
- Title: ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network
- Authors: Xingcheng Fu, Jianxin Li, Jia Wu, Qingyun Sun, Cheng Ji, Senzhang
Wang, Jiajun Tan, Hao Peng and Philip S. Yu
- Abstract summary: We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, that adaptively learns the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
- Score: 72.16255675586089
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been widely studied in various graph data
mining tasks. Most existing GNNs embed graph data into Euclidean space and are thus
less effective at capturing the ubiquitous hierarchical structures in
real-world networks. Hyperbolic Graph Neural Networks (HGNNs) extend GNNs to
hyperbolic space and are thus more effective at capturing the hierarchical
structures of graphs in node representation learning. In hyperbolic geometry,
the graph hierarchical structure can be reflected by the curvatures of the
hyperbolic space, and different curvatures can model different hierarchical
structures of a graph. However, most existing HGNNs manually set the curvature
to a fixed value for simplicity, which yields suboptimal graph learning
performance because the hierarchical structures of real-world graphs are
complex and diverse. To resolve this problem, we propose an Adaptive Curvature
Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, that adaptively
learns the optimal curvature according to the input graph and downstream tasks.
Specifically, ACE-HGNN exploits a multi-agent reinforcement learning framework
and contains two agents, ACE-Agent and HGNN-Agent, for learning the curvature
and node representations, respectively. The two agents are updated
collaboratively by a Nash Q-learning algorithm, seeking the optimal hyperbolic
space indexed by the curvature. Extensive experiments on multiple real-world
graph datasets demonstrate significant and consistent improvements in model
quality, with competitive performance and good generalization ability.
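Curvature enters an HGNN through the hyperbolic operations themselves. As a minimal, illustrative sketch (my own NumPy illustration, not the authors' implementation), the code below shows a Poincaré-ball exponential map and geodesic distance parameterized by a curvature value `c`, the quantity that ACE-HGNN learns adaptively; all function names here are hypothetical.

```python
import numpy as np

def expmap0(v, c):
    """Exponential map at the origin of the Poincare ball of curvature -c (c > 0):
    maps a Euclidean tangent vector v into the ball of radius 1/sqrt(c)."""
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(v)
    if norm == 0:
        return v.copy()
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def mobius_add(x, y, c):
    """Mobius addition, the hyperbolic analogue of vector addition on the ball."""
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den

def hyp_dist(x, y, c):
    """Geodesic distance on the Poincare ball of curvature -c."""
    sqrt_c = np.sqrt(c)
    return (2.0 / sqrt_c) * np.arctanh(sqrt_c * np.linalg.norm(mobius_add(-x, y, c)))
```

Varying `c` changes both the radius of the ball and how quickly distances grow toward its boundary, which is why different curvatures can model different hierarchical structures.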
Related papers
- GRE^2-MDCL: Graph Representation Embedding Enhanced via Multidimensional Contrastive Learning [0.0]
Graph representation learning has emerged as a powerful tool for preserving graph topology when mapping nodes to vector representations.
Current graph neural network models face the challenge of requiring extensive labeled data.
We propose Graph Representation Embedding Enhanced via Multidimensional Contrastive Learning.
arXiv Detail & Related papers (2024-09-12T03:09:05Z)
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world application of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Implicit Graph Neural Diffusion Networks: Convergence, Generalization, and Over-Smoothing [7.984586585987328]
Implicit Graph Neural Networks (GNNs) have achieved significant success in addressing graph learning problems.
We introduce a geometric framework for designing implicit graph diffusion layers based on a parameterized graph Laplacian operator.
We show how implicit GNN layers can be viewed as the fixed-point equation of a Dirichlet energy minimization problem.
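The fixed-point view can be made concrete with a toy sketch (my own illustration, not the paper's layer): iterating a linear diffusion step Z ← (1 − α)SZ + αX until it stops changing yields the fixed point of that equation, where S is a normalized adjacency. The specific update and all names are assumptions.

```python
import numpy as np

def implicit_diffusion(A, X, alpha=0.1, tol=1e-10, max_iter=1000):
    """Iterate Z <- (1 - alpha) * S @ Z + alpha * X to its fixed point,
    where S is the symmetrically normalized adjacency with self-loops.
    Since the spectral radius of (1 - alpha) * S is below 1, the
    iteration is a contraction and converges."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                      # add self-loops
    d = A_hat.sum(axis=1)
    S = A_hat / np.sqrt(np.outer(d, d))        # D^-1/2 A_hat D^-1/2
    Z = X.copy()
    for _ in range(max_iter):
        Z_new = (1 - alpha) * S @ Z + alpha * X
        if np.linalg.norm(Z_new - Z) < tol:    # reached the fixed point
            return Z_new
        Z = Z_new
    return Z
```

At convergence Z satisfies the fixed-point equation exactly, which is the sense in which such a layer has "infinite depth" without over-smoothing away the input term αX.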
arXiv Detail & Related papers (2023-08-07T05:22:33Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
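The "one parameter per relation" idea can be sketched as a single aggregation step in which every edge type contributes through one learned scalar. This is a hypothetical NumPy illustration of that idea, not the RE-GNN implementation.

```python
import numpy as np

def relation_weighted_layer(X, typed_edges, rel_weight, self_weight):
    """One aggregation step where each edge type r contributes through a
    single scalar rel_weight[r], and self-loops through self_weight.

    X:           (n, d) node feature matrix
    typed_edges: list of (src, dst, relation_id) tuples
    rel_weight:  one learnable scalar per relation id
    """
    out = self_weight * X.copy()               # self-loop contribution
    for src, dst, rel in typed_edges:
        out[dst] += rel_weight[rel] * X[src]   # scale message by relation scalar
    return out
```

The appeal is parameter efficiency: a heterogeneous graph with R edge types adds only R extra scalars on top of an otherwise homogeneous GNN.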
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
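The two construction methods can be sketched as follows (an illustrative NumPy version with my own function names, not the paper's code):

```python
import numpy as np

def knn_graph(points, k):
    """Symmetric adjacency connecting each point to its k nearest neighbors
    under Euclidean distance; union-symmetrized, as is common practice."""
    n = len(points)
    # pairwise distance matrix via broadcasting
    D = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)                # exclude self-edges
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, np.argsort(D[i])[:k]] = 1         # k closest neighbors of i
    return np.maximum(A, A.T)                  # symmetrize (union of directions)

def fc_graph(n):
    """Fully-connected graph on n nodes, without self-loops."""
    return np.ones((n, n), dtype=int) - np.eye(n, dtype=int)
```

KNN graphs keep interactions local (radius controlled by k), while FC graphs expose every pairwise interaction to the message-passing layers, which is exactly the range-of-interaction contrast the paper studies.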
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Topological Graph Neural Networks [14.349152231293928]
We present TOGL, a novel layer that incorporates global topological information of a graph using persistent homology.
Augmenting GNNs with our layer leads to beneficial predictive performance, both on synthetic data sets and on real-world data.
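Persistent homology on graphs can be illustrated in its simplest, 0-dimensional form: sweep nodes in increasing order of a filtration value and record when connected components are born and when they merge (the younger component "dies", by the elder rule). This union-find sketch is my own illustration of that computation, not TOGL's layer.

```python
from collections import defaultdict

def zero_dim_persistence(values, edges):
    """0-dimensional persistence pairs of a node-valued filtration on a graph.
    values[i] is node i's filtration value; edges is a list of (u, v) pairs.
    Returns sorted (birth, death) pairs; surviving components die at infinity."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    parent, birth = {}, {}

    def find(x):                               # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    pairs = []
    for v in sorted(range(len(values)), key=lambda i: values[i]):
        parent[v], birth[v] = v, values[v]     # v enters the filtration
        for u in adj[v]:
            if u in parent:                    # neighbor already present
                ru, rv = find(u), find(v)
                if ru == rv:
                    continue
                if birth[ru] > birth[rv]:
                    ru, rv = rv, ru            # keep ru as the elder root
                pairs.append((birth[rv], values[v]))  # younger component dies
                parent[rv] = ru
    for r in {find(v) for v in parent}:
        pairs.append((birth[r], float('inf')))  # components that never die
    return sorted(pairs)
```

The resulting (birth, death) pairs summarize connectivity of the filtration; a topological layer would turn such pairs into differentiable node or graph features.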
arXiv Detail & Related papers (2021-02-15T20:27:56Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes of a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN)
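Coarsening a flat graph into one level of a super graph, given community assignments, can be sketched as follows (a hypothetical helper of my own, not HC-GNN's code):

```python
import numpy as np

def build_super_graph(A, communities):
    """Coarsen adjacency A into a super graph: one super-node per community,
    with edge weight equal to the number of inter-community edge entries.
    communities[i] is the community label of node i."""
    comms = sorted(set(communities))
    idx = {c: i for i, c in enumerate(comms)}  # community label -> super-node id
    S = np.zeros((len(comms), len(comms)))
    n = A.shape[0]
    for u in range(n):
        for v in range(n):
            if A[u, v]:
                cu, cv = idx[communities[u]], idx[communities[v]]
                if cu != cv:                   # intra-community edges are absorbed
                    S[cu, cv] += A[u, v]
    return S
```

Applying such a coarsening repeatedly gives the multi-level hierarchy over which messages can then be passed both within and across levels.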
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.