HetTree: Heterogeneous Tree Graph Neural Network
- URL: http://arxiv.org/abs/2402.13496v1
- Date: Wed, 21 Feb 2024 03:14:45 GMT
- Title: HetTree: Heterogeneous Tree Graph Neural Network
- Authors: Mingyu Guan, Jack W. Stokes, Qinlong Luo, Fuchen Liu, Purvanshi Mehta,
Elnaz Nouri, Taesoo Kim
- Abstract summary: HetTree is a novel heterogeneous tree graph neural network that models both the graph structure and heterogeneous aspects.
HetTree builds a semantic tree data structure to capture the hierarchy among metapaths.
Our evaluation of HetTree on a variety of real-world datasets demonstrates that it outperforms all existing baselines on open benchmarks.
- Score: 12.403166161903378
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The recent past has seen an increasing interest in Heterogeneous Graph Neural
Networks (HGNNs) since many real-world graphs are heterogeneous in nature, from
citation graphs to email graphs. However, existing methods ignore a tree
hierarchy among metapaths, which is naturally constituted by different node
types and relation types. In this paper, we present HetTree, a novel
heterogeneous tree graph neural network that models both the graph structure
and heterogeneous aspects in a scalable and effective manner. Specifically,
HetTree builds a semantic tree data structure to capture the hierarchy among
metapaths. Existing tree encoding techniques aggregate child nodes by weighting
their contributions according to similarity with the parent node. However, we
find that such an encoding fails to capture the full parent-children hierarchy
because it considers only the parent node. Hence, HetTree uses a novel subtree
attention mechanism to emphasize metapaths that are more helpful in encoding
parent-children relationships (see the sketch after the abstract). Moreover, instead of
separating feature learning from label learning or treating features and labels
equally by projecting them to the same latent space, HetTree proposes to match
them carefully based on corresponding metapaths, which provides more accurate
and richer information between node features and labels. Our evaluation of
HetTree on a variety of real-world datasets demonstrates that it outperforms
all existing baselines on open benchmarks and efficiently scales to large
real-world graphs with millions of nodes and edges.
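To make the subtree attention idea concrete, below is a minimal PyTorch sketch of one way such an aggregation could work: each child metapath is scored against a summary of the whole subtree (parent plus children) rather than against the parent alone. The module name, shapes, mean-pooled subtree summary, scaled dot-product scoring, and residual connection are illustrative assumptions, not HetTree's actual implementation.

```python
# Illustrative sketch only, not HetTree's code: subtree-style attention that
# scores each child metapath against a summary of the whole subtree
# (parent + children) instead of the parent alone.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SubtreeAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)  # projects the subtree summary
        self.key = nn.Linear(dim, dim)    # projects each child metapath feature

    def forward(self, parent: torch.Tensor, children: torch.Tensor) -> torch.Tensor:
        # parent:   (batch, dim)    feature of the parent metapath node
        # children: (batch, k, dim) features of its k child metapaths
        subtree = torch.cat([parent.unsqueeze(1), children], dim=1)  # (batch, k+1, dim)
        summary = subtree.mean(dim=1)                                # subtree summary
        q = self.query(summary).unsqueeze(1)                         # (batch, 1, dim)
        keys = self.key(children)                                    # (batch, k, dim)
        scores = (q * keys).sum(-1) / keys.size(-1) ** 0.5           # (batch, k)
        weights = F.softmax(scores, dim=-1)                          # attention over children
        aggregated = (weights.unsqueeze(-1) * children).sum(dim=1)   # (batch, dim)
        return aggregated + parent                                   # keep the parent signal


if __name__ == "__main__":
    attn = SubtreeAttention(dim=16)
    parent = torch.randn(4, 16)          # 4 target nodes
    children = torch.randn(4, 3, 16)     # 3 child metapaths each
    print(attn(parent, children).shape)  # torch.Size([4, 16])
```

Scoring against a subtree summary rather than the parent alone is the point of contrast with the parent-only tree encodings criticized in the abstract.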
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
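As a rough illustration of the differentiable sampling such an operator builds on, the snippet below uses PyTorch's standard Gumbel-Softmax to relax the discrete choice of which node each node attends to. The dense pairwise scoring shown here is quadratic; NodeFormer's kernelized, linear-complexity variant is not reproduced, and the dot-product scores are an assumption.

```python
# Toy all-pair message passing with Gumbel-Softmax-relaxed edge sampling.
# Not NodeFormer itself: the paper's kernelized trick avoids the dense (n, n) matrix.
import torch
import torch.nn.functional as F


def gumbel_softmax_messages(x: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    # x: (n, d) node features; score every node pair with scaled dot products.
    logits = x @ x.t() / x.size(-1) ** 0.5                # (n, n)
    # Relaxed categorical "pick a neighbor" choice, kept differentiable.
    weights = F.gumbel_softmax(logits, tau=tau, hard=False, dim=-1)
    return weights @ x                                    # aggregate messages from all nodes


x = torch.randn(128, 32)
print(gumbel_softmax_messages(x).shape)  # torch.Size([128, 32])
```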
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- GTNet: A Tree-Based Deep Graph Learning Architecture [8.50892442127182]
We propose a deep graph learning architecture with a new general message passing scheme that originates from the tree representation of graphs.
Two graph representation learning models are proposed within the GTNet architecture: the Graph Tree Attention Network (GTAN) and the Graph Tree Convolution Network (GTCN).
arXiv Detail & Related papers (2022-04-27T09:43:14Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
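For intuition, here is a compact sketch of what a final metapath-level fusion step could look like: embeddings produced per metapath are combined with learned attention weights. The two-layer scoring network and the tensor shapes are assumptions for illustration, not SHGNN's exact modules.

```python
# Hypothetical metapath-level fusion: attention over per-metapath node embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetapathFusion(nn.Module):
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        # Small scoring network that rates each metapath's embedding.
        self.score = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, metapath_embs: torch.Tensor) -> torch.Tensor:
        # metapath_embs: (num_metapaths, n, dim), one embedding per node per metapath.
        s = self.score(metapath_embs).mean(dim=1)   # (num_metapaths, 1) score per metapath
        beta = F.softmax(s, dim=0).unsqueeze(-1)    # attention weights over metapaths
        return (beta * metapath_embs).sum(dim=0)    # (n, dim) fused node embeddings


fusion = MetapathFusion(dim=32)
embs = torch.randn(4, 100, 32)   # 4 metapaths, 100 nodes
print(fusion(embs).shape)        # torch.Size([100, 32])
```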
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Graph Tree Neural Networks [0.43012765978447565]
Graph tree neural networks (GTNNs) are designed to address the limitations of existing networks by drawing on the structure of human neural networks.
In GTNNs, information units are related in the form of a graph; these units are then combined into larger units of information, which in turn relate to other information units.
arXiv Detail & Related papers (2021-10-31T07:58:00Z)
- TD-GEN: Graph Generation With Tree Decomposition [31.751200416677225]
TD-GEN is a graph generation framework based on tree decomposition.
Tree nodes are supernodes, each representing a cluster of nodes in the graph.
arXiv Detail & Related papers (2021-06-20T08:57:43Z)
- Neural Trees for Learning on Graphs [19.05038106825347]
Graph Neural Networks (GNNs) have emerged as a flexible and powerful approach for learning over graphs.
We propose a new GNN architecture -- the Neural Tree.
We show that the neural tree architecture can approximate any smooth probability distribution function over an undirected graph.
arXiv Detail & Related papers (2021-05-15T17:08:20Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Visualizing hierarchies in scRNA-seq data using a density tree-biased autoencoder [50.591267188664666]
We propose an approach for identifying a meaningful tree structure from high-dimensional scRNA-seq data.
We then introduce DTAE, a tree-biased autoencoder that emphasizes the tree structure of the data in low dimensional space.
arXiv Detail & Related papers (2021-02-11T08:48:48Z)
- Tree Structure-Aware Graph Representation Learning via Integrated Hierarchical Aggregation and Relational Metric Learning [26.8738194817491]
We propose T-GNN, a tree structure-aware graph neural network model for graph representation learning.
The proposed T-GNN consists of two modules: (1) the integrated hierarchical aggregation module and (2) the relational metric learning module.
arXiv Detail & Related papers (2020-08-23T09:41:19Z)
- Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
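The snippet below sketches one plausible reading of that composition, assuming the neighbor-based kernel is simply the adjacency matrix and the learnable kernel is an RBF similarity computed in a learned projection space; the elementwise product and row normalization are illustrative choices rather than the paper's exact formulation.

```python
# Hypothetical composite-kernel aggregation: adjacency kernel x learnable feature kernel.
import torch
import torch.nn as nn


class CompositeKernelAggregation(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)  # learnable feature space for the similarity kernel

    def forward(self, adj: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # adj: (n, n) binary adjacency; x: (n, dim) node features.
        z = self.proj(x)
        feat_kernel = torch.exp(-torch.cdist(z, z) ** 2)  # RBF similarity in the learned space
        composite = adj * feat_kernel                     # keep similarity only on real edges
        composite = composite / composite.sum(-1, keepdim=True).clamp(min=1e-9)
        return composite @ x                              # kernel-weighted neighborhood average


adj = (torch.rand(50, 50) < 0.1).float()
x = torch.randn(50, 16)
print(CompositeKernelAggregation(dim=16)(adj, x).shape)  # torch.Size([50, 16])
```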
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)