Tree Structure-Aware Graph Representation Learning via Integrated
Hierarchical Aggregation and Relational Metric Learning
- URL: http://arxiv.org/abs/2008.10003v2
- Date: Mon, 21 Sep 2020 05:59:43 GMT
- Title: Tree Structure-Aware Graph Representation Learning via Integrated
Hierarchical Aggregation and Relational Metric Learning
- Authors: Ziyue Qiao, Pengyang Wang, Yanjie Fu, Yi Du, Pengfei Wang, Yuanchun
Zhou
- Abstract summary: We propose T-GNN, a tree structure-aware graph neural network model for graph representation learning.
The proposed T-GNN consists of two modules: (1) the integrated hierarchical aggregation module and (2) the relational metric learning module.
- Score: 26.8738194817491
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While Graph Neural Networks (GNNs) have shown superiority in learning node
representations of homogeneous graphs, leveraging GNNs on heterogeneous graphs
remains a challenging problem. The dominant reason is that GNNs learn node
representations by aggregating neighbors' information regardless of node types.
Some works alleviate this issue by exploiting relations or meta-paths to sample
neighbors of distinct categories and then using an attention mechanism to learn
different importance weights for different categories. However, one limitation
is that the learned representations of different node types should occupy
different feature spaces, while all of the above works still project node
representations into a single feature space. Moreover, after exploring massive
heterogeneous graphs, we identify the fact that multiple nodes of the same type
always connect to a node of another type, which reveals a many-to-one schema,
a.k.a. a hierarchical tree structure. None of the above works can preserve such
a tree structure, since the exact multi-hop path correlation from neighbors to
the target node is erased through aggregation. Therefore, to overcome these
limitations, we propose T-GNN, a tree structure-aware graph neural network
model for graph representation learning. Specifically, T-GNN consists of two
modules: (1) an integrated hierarchical aggregation module and (2) a relational
metric learning module. The integrated hierarchical aggregation module preserves
the tree structure by combining a GNN with a Gated Recurrent Unit (GRU) to
integrate the hierarchical and sequential neighborhood information on the tree
structure into node representations. The relational metric learning module
preserves the heterogeneity by embedding each type of node into a type-specific
space with a distinct distribution based on similarity metrics.
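As a concrete illustration of the two modules described in the abstract, below is a minimal PyTorch sketch: per-level GNN-style transforms whose level summaries are fed leaf-to-root through a GRU (the integrated hierarchical aggregation), and type-specific projections scored with a learnable bilinear metric (the relational metric learning). The class names, the mean-pooling and single-linear-layer choices, and all dimensions are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of the two T-GNN modules as described in the abstract.
# All names (TreeAggregator, RelationalMetric) and layer choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TreeAggregator(nn.Module):
    """Integrated hierarchical aggregation: summarize each level of the
    hierarchical tree with a simple GNN-style transform, then feed the
    per-level summaries leaf-to-root through a GRU so the sequential
    (multi-hop) ordering of the tree survives in the root representation."""
    def __init__(self, dim):
        super().__init__()
        self.gnn = nn.Linear(dim, dim)                 # per-level neighbor transform
        self.gru = nn.GRU(dim, dim, batch_first=True)  # sequential integration over levels

    def forward(self, levels):
        # levels: list of tensors, levels[k] has shape (n_k, dim),
        # ordered from the deepest tree level up to the root's children.
        summaries = [F.relu(self.gnn(h)).mean(dim=0) for h in levels]
        seq = torch.stack(summaries).unsqueeze(0)      # (1, num_levels, dim)
        _, h_last = self.gru(seq)                      # final hidden state
        return h_last.squeeze(0).squeeze(0)            # (dim,)

class RelationalMetric(nn.Module):
    """Relational metric learning: project nodes of each type into a
    type-specific space and score cross-type pairs with a learnable metric."""
    def __init__(self, dim, num_types):
        super().__init__()
        self.type_proj = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_types)])
        self.metric = nn.Parameter(torch.eye(dim))     # shared learnable bilinear metric

    def forward(self, h_u, type_u, h_v, type_v):
        zu = self.type_proj[type_u](h_u)
        zv = self.type_proj[type_v](h_v)
        return zu @ self.metric @ zv                   # similarity score

# toy usage: a 3-level tree neighborhood and one cross-type node pair
dim = 16
levels = [torch.randn(5, dim), torch.randn(3, dim), torch.randn(1, dim)]
root_repr = TreeAggregator(dim)(levels)
score = RelationalMetric(dim, num_types=3)(root_repr, 0, torch.randn(dim), 1)
print(root_repr.shape, score.item())
```

The GRU is the piece that keeps the level-by-level, multi-hop ordering of the tree from being flattened away by plain neighborhood averaging, which is the failure mode the abstract attributes to prior work.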
Related papers
- Incorporating Heterophily into Graph Neural Networks for Graph Classification [6.709862924279403]
Graph Neural Networks (GNNs) often assume strong homophily for graph classification, seldom considering heterophily.
We develop a novel GNN architecture called IHGNN (short for Incorporating Heterophily into Graph Neural Networks).
We empirically validate IHGNN on various graph datasets and demonstrate that it outperforms the state-of-the-art GNNs for graph classification.
arXiv Detail & Related papers (2022-03-15T06:48:35Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into different communities, these community-specific GNNs, and a GNN-based predictor that combines the community-specific GNNs for the end classification task.
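As a rough, assumption-laden sketch of the edge-partition idea: each observed edge is split across K communities by a soft assignment (supplied externally here, whereas the paper infers it with a GNN-based variational inference network), one small community-specific GNN runs on each weighted subgraph, and their outputs are summed for prediction. The names and the single-linear-layer GNNs are placeholders.

```python
# Illustrative-only sketch of partitioning edges into community-specific
# weighted graphs and combining community-specific GNNs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgePartitionSketch(nn.Module):
    def __init__(self, in_dim, out_dim, K):
        super().__init__()
        self.K = K
        self.gnns = nn.ModuleList([nn.Linear(in_dim, out_dim) for _ in range(K)])

    def forward(self, A, X, edge_logits):
        # edge_logits: (n, n, K) soft edge-to-community scores; in the paper these
        # come from a GNN-based inference network, here they are a plain input.
        weights = F.softmax(edge_logits, dim=-1) * A.unsqueeze(-1)  # A = sum over A_k
        outs = [F.relu(self.gnns[k](weights[..., k] @ X)) for k in range(self.K)]
        return torch.stack(outs).sum(dim=0)     # combine community-specific GNNs

A = (torch.rand(10, 10) < 0.3).float()
X = torch.randn(10, 8)
model = EdgePartitionSketch(in_dim=8, out_dim=5, K=4)
print(model(A, X, torch.randn(10, 10, 4)).shape)  # torch.Size([10, 5])
```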
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
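A minimal sketch of what a tree-attention aggregation step could look like, assuming a simple concatenation-based attention from a meta-path tree node over its children; SHGNN's exact operator may differ, and all names here are illustrative.

```python
# Toy tree-attention step: a parent attends over its children on the
# meta-path tree and folds the weighted sum into its own representation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TreeAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.att = nn.Linear(2 * dim, 1)

    def forward(self, parent, children):
        # parent: (dim,), children: (num_children, dim)
        pairs = torch.cat([parent.expand(children.size(0), -1), children], dim=-1)
        alpha = F.softmax(self.att(pairs).squeeze(-1), dim=0)  # attention weights
        return parent + alpha @ children                       # aggregate upward

agg = TreeAttention(dim=16)
print(agg(torch.randn(16), torch.randn(4, 16)).shape)  # torch.Size([16])
```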
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- HMSG: Heterogeneous Graph Neural Network based on Metapath Subgraph Learning [2.096172374930129]
We propose a new heterogeneous graph neural network model named HMSG.
We decompose the heterogeneous graph into multiple subgraphs.
Each subgraph carries specific semantic and structural information.
Through a type-specific attribute transformation, node attributes can also be transferred among different types of nodes.
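A small sketch of such a type-specific attribute transformation, under the assumption that each node type carries features of a different dimensionality and is projected into one shared space before message passing on the metapath subgraphs; the type names and dimensions are made up for illustration.

```python
# One projection per node type maps heterogeneous attributes into a shared space.
import torch
import torch.nn as nn

class TypeSpecificTransform(nn.Module):
    def __init__(self, in_dims, shared_dim):
        super().__init__()
        # e.g. in_dims = {"author": 32, "paper": 64} -- hypothetical types/sizes
        self.proj = nn.ModuleDict({t: nn.Linear(d, shared_dim) for t, d in in_dims.items()})

    def forward(self, feats_by_type):
        return {t: self.proj[t](x) for t, x in feats_by_type.items()}

transform = TypeSpecificTransform({"author": 32, "paper": 64}, shared_dim=16)
out = transform({"author": torch.randn(5, 32), "paper": torch.randn(7, 64)})
print({t: v.shape for t, v in out.items()})
```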
arXiv Detail & Related papers (2021-09-07T05:02:59Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks [68.9026534589483]
RioGNN is a novel reinforced, recursive, and flexible neighborhood-selection-guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embeddings with enhanced explainability by recognizing the individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z)
- Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
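A brief sketch of aggregation with such a composite kernel, assuming a Gaussian feature-similarity kernel over a learnable feature mapping, composed elementwise with the fixed neighbor (adjacency) kernel; the paper's exact kernel forms may differ, and the names are illustrative.

```python
# Composite-kernel aggregation: fixed neighbor kernel * learnable feature kernel.
import torch
import torch.nn as nn

class CompositeKernelAggregation(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.map = nn.Linear(dim, dim)                  # learnable feature-space mapping
        self.gamma = nn.Parameter(torch.tensor(1.0))    # kernel bandwidth

    def forward(self, A, X):
        Z = self.map(X)
        dist = torch.cdist(Z, Z) ** 2                   # pairwise squared distances
        feat_kernel = torch.exp(-self.gamma * dist)     # learnable similarity kernel
        K = A * feat_kernel                             # compose with neighbor kernel
        K = K / K.sum(dim=1, keepdim=True).clamp(min=1e-8)  # row-normalize weights
        return K @ X                                    # kernel-weighted aggregation

A = (torch.rand(6, 6) < 0.5).float()
X = torch.randn(6, 8)
print(CompositeKernelAggregation(8)(A, X).shape)  # torch.Size([6, 8])
```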
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
- MAGNN: Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding [36.6390478350677]
We propose a new model named Metapath Aggregated Graph Neural Network (MAGNN) to boost the final performance.
MAGNN employs three major components, i.e., the node content transformation to encapsulate input node attributes, the intra-metapath aggregation to incorporate intermediate semantic nodes, and the inter-metapath aggregation to combine messages from multiple metapaths.
Experiments show that MAGNN achieves more accurate prediction results than state-of-the-art baselines.
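A high-level sketch of those three components under simplifying assumptions (mean pooling within and across metapath instances, softmax attention across metapaths); MAGNN's actual encoders and attention operators differ, and the names below are illustrative.

```python
# Three stages: node content transformation, intra-metapath aggregation over
# instances (including intermediate nodes), inter-metapath combination.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetapathSketch(nn.Module):
    def __init__(self, in_dim, dim):
        super().__init__()
        self.content = nn.Linear(in_dim, dim)   # node content transformation
        self.inter_att = nn.Linear(dim, 1)      # inter-metapath attention scorer

    def forward(self, target_x, metapath_instances):
        h = self.content(target_x)                           # (dim,)
        per_path = []
        for instances in metapath_instances:                 # one entry per metapath
            # instances: (num_instances, path_len, in_dim), intermediate nodes included
            inst = self.content(instances).mean(dim=1)       # intra-metapath: encode each instance
            per_path.append(inst.mean(dim=0))                # ...then pool the instances
        P = torch.stack(per_path)                            # (num_metapaths, dim)
        beta = F.softmax(self.inter_att(P).squeeze(-1), dim=0)
        return h + beta @ P                                  # inter-metapath combination

model = MetapathSketch(in_dim=8, dim=16)
paths = [torch.randn(3, 4, 8), torch.randn(5, 2, 8)]        # two hypothetical metapaths
print(model(torch.randn(8), paths).shape)                   # torch.Size([16])
```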
arXiv Detail & Related papers (2020-02-05T08:21:00Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)