Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph
Neural Networks
- URL: http://arxiv.org/abs/2206.11081v2
- Date: Fri, 24 Jun 2022 09:21:01 GMT
- Title: Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph
Neural Networks
- Authors: Hongjoon Ahn, Yongyi Yang, Quan Gan, David Wipf and Taesup Moon
- Abstract summary: Heterogeneous graph neural networks (GNNs) achieve strong performance on node classification tasks in a semi-supervised learning setting.
We propose a novel heterogeneous GNN architecture in which layers are derived from optimization steps that descend a novel relation-aware energy function.
- Score: 25.59092732148598
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graph neural networks (GNNs) achieve strong performance on node
classification tasks in a semi-supervised learning setting. However, as in the
simpler homogeneous GNN case, message-passing-based heterogeneous GNNs may
struggle to balance resisting the oversmoothing that occurs in deep models
against capturing long-range dependencies in graph-structured data. Moreover, the
complexity of this trade-off is compounded in the heterogeneous graph case due
to the disparate heterophily relationships between nodes of different types. To
address these issues, we propose a novel heterogeneous GNN architecture in
which layers are derived from optimization steps that descend a novel
relation-aware energy function. The corresponding minimizer is fully
differentiable with respect to the energy function parameters, such that
bilevel optimization can be applied to effectively learn a functional form
whose minimum provides optimal node representations for subsequent
classification tasks. In particular, this methodology allows us to model
diverse heterophily relationships between different node types while avoiding
oversmoothing effects. Experimental results on 8 heterogeneous graph benchmarks
demonstrate that our proposed method achieves competitive node classification
accuracy.
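The abstract does not spell out the exact form of the relation-aware energy or its descent rule, but the general recipe of unrolling descent steps into GNN layers can be sketched. The snippet below is a minimal, hypothetical illustration assuming a quadratic energy E(Y) = ||Y - f(X)||_F^2 + sum_r lambda_r tr(Y^T L_r Y), with one Laplacian L_r per relation type and a learnable, relation-specific weight lambda_r; the class name, energy form, and hyperparameters are illustrative assumptions, not the paper's actual implementation.
```python
# Illustrative sketch only: the paper's exact relation-aware energy and descent
# rule are not reproduced here. We assume a quadratic energy of the form
#   E(Y) = ||Y - f(X)||_F^2 + sum_r lambda_r * tr(Y^T L_r Y),
# where L_r is a Laplacian built from the edges of relation r and lambda_r is a
# learnable, relation-specific weight. Each gradient-descent step on E(Y) plays
# the role of one "layer" of the resulting heterogeneous GNN.
import torch
import torch.nn as nn


class EnergyDescentHeteroGNN(nn.Module):
    """Hypothetical unrolled-descent model; names and energy form are assumptions."""

    def __init__(self, in_dim, hid_dim, num_relations, num_steps=8, step_size=0.1):
        super().__init__()
        self.f = nn.Linear(in_dim, hid_dim)                           # base predictor f(X)
        self.log_lambda = nn.Parameter(torch.zeros(num_relations))    # relation weights (kept positive)
        self.num_steps = num_steps
        self.step_size = step_size

    def forward(self, x, laplacians):
        """x: [N, in_dim]; laplacians: list of sparse [N, N] Laplacians, one per relation."""
        fx = self.f(x)                                 # anchor term f(X)
        y = fx.clone()                                 # initialize node states at f(X)
        lam = self.log_lambda.exp()                    # lambda_r >= 0
        for _ in range(self.num_steps):                # one descent step == one "layer"
            grad = 2.0 * (y - fx)                      # d/dY ||Y - f(X)||^2
            for r, L_r in enumerate(laplacians):
                grad = grad + 2.0 * lam[r] * torch.sparse.mm(L_r, y)  # d/dY tr(Y^T L_r Y)
            y = y - self.step_size * grad              # gradient-descent update
        return y                                       # node representations for a downstream classifier


# Toy usage: 5 nodes, 2 relations, identity Laplacians just to exercise the code.
if __name__ == "__main__":
    N, D = 5, 4
    eye = torch.eye(N).to_sparse()
    model = EnergyDescentHeteroGNN(in_dim=D, hid_dim=8, num_relations=2)
    out = model(torch.randn(N, D), [eye, eye])
    print(out.shape)  # torch.Size([5, 8])
```
Because every descent step above is differentiable, the energy parameters (f and the lambda_r) can be trained end-to-end against a downstream node classification loss, loosely mirroring the bilevel optimization described in the abstract.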
Related papers
- A GAN Approach for Node Embedding in Heterogeneous Graphs Using Subgraph Sampling [33.50085646298074]
We propose a novel framework that combines Graph Neural Networks (GNNs) and Generative Adversarial Networks (GANs) to enhance classification for underrepresented node classes.
The framework incorporates an advanced edge generation and selection module, enabling the simultaneous creation of synthetic nodes and edges.
arXiv Detail & Related papers (2023-12-11T16:52:20Z)
- Histopathology Whole Slide Image Analysis with Heterogeneous Graph Representation Learning [78.49090351193269]
We propose a novel graph-based framework to leverage the inter-relationships among different types of nuclei for WSI analysis.
Specifically, we formulate the WSI as a heterogeneous graph with a "nucleus-type" attribute attached to each node and a semantic similarity attribute attached to each edge.
Our framework outperforms the state-of-the-art methods with considerable margins on various tasks.
arXiv Detail & Related papers (2023-07-09T14:43:40Z)
- Demystifying Structural Disparity in Graph Neural Networks: Can One Size Fit All? [61.35457647107439]
Most real-world homophilic and heterophilic graphs are comprised of a mixture of nodes in both homophilic and heterophilic structural patterns.
We provide evidence that Graph Neural Networks (GNNs) on node classification typically perform admirably on homophilic nodes.
We then propose a rigorous, non-i.i.d. PAC-Bayesian generalization bound for GNNs, revealing reasons for the performance disparity.
arXiv Detail & Related papers (2023-06-02T07:46:20Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model effectively improves performance on semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
The majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta-path-free message passing scheme that packs up heterogeneous node features together with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
- Meta-path Free Semi-supervised Learning for Heterogeneous Networks [16.641434334366227]
Graph neural networks (GNNs) have been widely used in representation learning on graphs and achieved superior performance in tasks such as node classification.
In this paper, we propose simple and effective graph neural networks for heterogeneous graphs, without the use of meta-paths.
arXiv Detail & Related papers (2020-10-18T06:01:58Z)
- Graph Neural Networks with Heterophily [40.23690407583509]
We propose a novel framework called CPGNN that generalizes GNNs for graphs with either homophily or heterophily.
We show that replacing the compatibility matrix in our framework with the identity (which represents pure homophily) recovers GCN.
arXiv Detail & Related papers (2020-09-28T18:29:36Z)
- Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design the heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)