Heterogeneous Graph Tree Networks
- URL: http://arxiv.org/abs/2209.00610v1
- Date: Thu, 1 Sep 2022 17:22:01 GMT
- Title: Heterogeneous Graph Tree Networks
- Authors: Nan Wu, Chaofan Wang
- Abstract summary: Heterogeneous graph neural networks (HGNNs) have attracted increasing research interest over the past three years.
One class is meta-path-based HGNNs, which either require domain knowledge to handcraft meta-paths or consume huge amounts of time and memory to construct meta-paths automatically.
We propose two models: Heterogeneous Graph Tree Convolutional Network (HetGTCN) and Heterogeneous Graph Tree Attention Network (HetGTAN).
- Score: 8.50892442127182
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graph neural networks (HGNNs) have attracted increasing
research interest over the past three years. Most existing HGNNs fall into two
classes. One class is meta-path-based HGNNs, which either require domain
knowledge to handcraft meta-paths or consume huge amounts of time and memory to
construct meta-paths automatically. The other class does not rely on meta-path
construction. It takes homogeneous convolutional graph neural networks
(Conv-GNNs) as backbones and extends them to heterogeneous graphs by introducing
node-type- and edge-type-dependent parameters. Regardless of meta-path
dependency, most existing HGNNs employ shallow Conv-GNNs such as GCN and GAT to
aggregate neighborhood information, and may have limited capability to capture
information from high-order neighborhoods. In this work, we propose two
heterogeneous graph tree network models: Heterogeneous Graph Tree Convolutional
Network (HetGTCN) and Heterogeneous Graph Tree Attention Network (HetGTAN),
neither of which relies on meta-paths to encode heterogeneity in node features
and graph structure. Extensive experiments on three real-world heterogeneous
graph datasets demonstrate that the proposed HetGTCN and HetGTAN are efficient,
consistently outperform all state-of-the-art HGNN baselines on semi-supervised
node classification tasks, and can go deep without compromising performance.
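The second class of HGNNs described in the abstract replaces meta-paths with node-type- and edge-type-dependent parameters. The following is a minimal numpy sketch of that idea (a generic illustration, not the paper's actual HetGTCN/HetGTAN operators): one message-passing layer where each edge type gets its own projection matrix, so heterogeneity is encoded without constructing meta-paths. The toy graph, feature sizes, and mean aggregation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous graph: 3 "author" nodes and 2 "paper" nodes, with
# "writes" (author -> paper) and "cites" (paper -> paper) edges.
X = {"author": rng.normal(size=(3, 4)), "paper": rng.normal(size=(2, 4))}
A = {
    ("author", "writes", "paper"): np.array([[1., 0.], [1., 1.], [0., 1.]]),
    ("paper", "cites", "paper"):   np.array([[0., 1.], [0., 0.]]),
}
# One projection matrix per edge type: the edge-type-dependent parameters.
W = {etype: rng.normal(size=(4, 4)) for etype in A}

def hetero_layer(X, A, W):
    """One message-passing layer with edge-type-dependent weights.

    Each target node starts from its own features (a self-connection), then
    adds the mean of its neighbors' projected features for every edge type.
    """
    out = {ntype: h.copy() for ntype, h in X.items()}
    for etype, adj in A.items():
        src, _rel, dst = etype
        deg = adj.sum(axis=0).clip(min=1.0)[:, None]   # in-degree of dst nodes
        msg = (adj.T @ X[src]) / deg                   # mean over src neighbors
        out[dst] = out[dst] + msg @ W[etype]
    return {ntype: np.maximum(h, 0.0) for ntype, h in out.items()}  # ReLU

H = hetero_layer(X, A, W)
```

Stacking several such layers lets information flow across node types through multi-hop composite relations, which is what handcrafted meta-paths would otherwise supply.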
Related papers
- HAGNN: Hybrid Aggregation for Heterogeneous Graph Neural Networks [15.22198175691658]
Heterogeneous graph neural networks (GNNs) have been successful in handling heterogeneous graphs.
Recent work pointed out that a simple homogeneous graph model without meta-paths can also achieve comparable results.
We propose a novel framework to utilize the rich type semantic information in heterogeneous graphs comprehensively, namely HAGNN.
arXiv Detail & Related papers (2023-07-04T10:40:20Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- 2-hop Neighbor Class Similarity (2NCS): A graph structural metric indicative of graph neural network performance [4.051099980410583]
Graph Neural Networks (GNNs) achieve state-of-the-art performance on graph-structured data across numerous domains.
On heterophilous graphs, in which connected nodes tend to belong to different classes, GNNs perform less consistently.
We introduce 2-hop Neighbor Class Similarity (2NCS), a new quantitative graph structural property that correlates with GNN performance more strongly and consistently than alternative metrics.
arXiv Detail & Related papers (2022-12-26T16:16:51Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Incorporating Heterophily into Graph Neural Networks for Graph Classification [6.709862924279403]
Graph Neural Networks (GNNs) often assume strong homophily for graph classification, seldom considering heterophily.
We develop a novel GNN architecture called IHGNN (short for Incorporating Heterophily into Graph Neural Networks).
We empirically validate IHGNN on various graph datasets and demonstrate that it outperforms the state-of-the-art GNNs for graph classification.
arXiv Detail & Related papers (2022-03-15T06:48:35Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCN) are widely used in learning graph representations due to their effectiveness and efficiency.
They suffer from the notorious over-smoothing problem, in which the learned representations converge to similar vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet) which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
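The gating idea summarized above can be sketched as follows: a learned gate interpolates, per node and per dimension, between the aggregated neighborhood representation and the node's own representation, so deep stacks need not collapse to similar vectors. This is an illustrative reconstruction under assumed notation, not GHNet's exact parameterization.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(h_self, h_agg, w_gate, b_gate):
    """Gate g in (0, 1): return g * h_agg + (1 - g) * h_self.

    h_self: (n, d) node's own features; h_agg: (n, d) aggregated neighborhood
    features; w_gate (d, d) and b_gate parameterize the gate (assumed form).
    """
    g = sigmoid(h_self @ w_gate + b_gate)
    return g * h_agg + (1.0 - g) * h_self
```

When the gate is nearly closed (g near 0) a node keeps its own features, which is what prevents the layer from fully homogenizing representations.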
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Networks (GNNs) are powerful models for learning representations and making predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
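The bilinear aggregation summarized in the BGNN entry above, augmenting the weighted sum with pairwise interactions of neighbor representations, can be sketched as below. This is a hedged reconstruction of the idea, not the authors' exact operator; the mean weighting and element-wise products are illustrative assumptions.

```python
import numpy as np

def bilinear_aggregate(H, neigh):
    """Mean aggregation plus mean pairwise element-wise interactions.

    H: (n, d) node features; neigh: list of neighbor indices for one node.
    """
    hs = H[neigh]                      # (k, d) neighbor features
    linear = hs.mean(axis=0)           # standard weighted-sum (here: mean) part
    # Pairwise term via the identity sum_{i<j} h_i*h_j = ((sum h)^2 - sum h^2)/2,
    # which avoids looping over all neighbor pairs explicitly.
    s, sq = hs.sum(axis=0), (hs ** 2).sum(axis=0)
    pair = (s * s - sq) / 2.0
    k = len(neigh)
    if k > 1:
        pair /= k * (k - 1) / 2.0      # mean over the C(k, 2) neighbor pairs
    return linear + pair
```

For example, with features `[[1, 0], [0, 1], [1, 1]]` and neighbors `[0, 1]`, the pairwise term vanishes (the two neighbors share no active dimension) and only the mean `[0.5, 0.5]` remains.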
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.