Edge Directionality Improves Learning on Heterophilic Graphs
- URL: http://arxiv.org/abs/2305.10498v3
- Date: Tue, 28 Nov 2023 18:33:37 GMT
- Title: Edge Directionality Improves Learning on Heterophilic Graphs
- Authors: Emanuele Rossi, Bertrand Charpentier, Francesco Di Giovanni, Fabrizio Frasca, Stephan Günnemann, Michael Bronstein
- Abstract summary: We introduce Directed Graph Neural Network (Dir-GNN), a novel framework for deep learning on directed graphs.
Dir-GNN can be used to extend any Message Passing Neural Network (MPNN) to account for edge directionality information.
We prove that Dir-GNN matches the expressivity of the Directed Weisfeiler-Lehman test, exceeding that of conventional MPNNs.
- Score: 42.5099159786891
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) have become the de facto standard tool for
modeling relational data. However, while many real-world graphs are directed,
the majority of today's GNN models discard this information altogether by
simply making the graph undirected. The reasons for this are historical: 1)
many early variants of spectral GNNs explicitly required undirected graphs, and
2) the first benchmarks on homophilic graphs did not find significant gains from
using direction. In this paper, we show that in heterophilic settings, treating
the graph as directed increases the effective homophily of the graph,
suggesting a potential gain from the correct use of directionality information.
To this end, we introduce Directed Graph Neural Network (Dir-GNN), a novel
general framework for deep learning on directed graphs. Dir-GNN can be used to
extend any Message Passing Neural Network (MPNN) to account for edge
directionality information by performing separate aggregations of the incoming
and outgoing edges. We prove that Dir-GNN matches the expressivity of the
Directed Weisfeiler-Lehman test, exceeding that of conventional MPNNs. In
extensive experiments, we validate that while our framework leaves performance
unchanged on homophilic datasets, it leads to large gains over base models such
as GCN, GAT and GraphSage on heterophilic benchmarks, outperforming much more
complex methods and achieving new state-of-the-art results.
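The claim that treating the graph as directed increases effective homophily can be made concrete with a toy computation. The sketch below is a simplified proxy for the paper's effective-homophily measure, not the authors' exact definition; the graph, labels, and the weighted-homophily function are illustrative assumptions. On a citation-style directed graph with time-like labels, every edge crosses labels (heterophilic at 1 hop), yet the directed two-hop operator A Aᵀ (nodes citing the same reference, i.e. bibliographic coupling) is perfectly homophilic while the undirected two-hop neighborhood is not:

```python
import numpy as np

def weighted_homophily(M, y):
    """Mean over nodes of the fraction of M-weight falling on same-label
    neighbours; the diagonal is removed and zero-weight rows are skipped."""
    M = M.copy()
    np.fill_diagonal(M, 0.0)
    same = (y[:, None] == y[None, :]).astype(float)
    row = M.sum(axis=1)
    keep = row > 0
    return ((M * same).sum(axis=1)[keep] / row[keep]).mean()

# Toy citation-style graph (hypothetical): A[i, j] = 1 means i cites j,
# and labels y act like publication "eras", so every edge crosses labels.
n = 6
y = np.array([0, 1, 1, 2, 2, 2])
edges = [(1, 0), (2, 0), (3, 1), (4, 1), (5, 2)]
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1.0
A_u = np.clip(A + A.T, 0, 1)              # undirected symmetrization

print(weighted_homophily(A_u, y))         # 1-hop, undirected: 0.0
print(weighted_homophily(A_u @ A_u, y))   # 2-hop, undirected: 0.5
print(weighted_homophily(A @ A.T, y))     # directed A A^T: 1.0
```

Discarding direction mixes the homophilic directed component A Aᵀ with heterophilic ones inside (A + Aᵀ)², which is exactly the information loss the paper argues against.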
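The separate in- and out-aggregation described in the abstract can likewise be sketched in a few lines. The dense NumPy layer below is a minimal illustration under assumed design choices: the parameter names (W_in, W_out, alpha), the GCN-style degree normalization, and the ReLU combination are ours, not the authors' implementation.

```python
import numpy as np

def dir_gnn_layer(X, A, W_in, W_out, alpha=0.5, eps=1e-9):
    """One Dir-GNN-style layer (illustrative): aggregate in-neighbours and
    out-neighbours separately, then combine. A[i, j] = 1 means edge i -> j."""
    d_out = np.maximum(A.sum(axis=1), eps)   # out-degrees
    d_in = np.maximum(A.sum(axis=0), eps)    # in-degrees
    # GCN-style normalization for a directed adjacency: D_out^{-1/2} A D_in^{-1/2}
    A_hat = A / np.sqrt(d_out)[:, None] / np.sqrt(d_in)[None, :]
    msg_in = A_hat.T @ X @ W_in    # aggregate features from in-neighbours
    msg_out = A_hat @ X @ W_out    # aggregate features from out-neighbours
    return np.maximum(alpha * msg_in + (1 - alpha) * msg_out, 0.0)  # ReLU

# Hypothetical usage on a random directed graph:
rng = np.random.default_rng(0)
A = (rng.random((6, 6)) < 0.3).astype(float)
np.fill_diagonal(A, 0.0)
X = rng.normal(size=(6, 8))
H = dir_gnn_layer(X, A, rng.normal(size=(8, 16)), rng.normal(size=(8, 16)))
print(H.shape)  # (6, 16)
```

Setting the undirected aggregation as a special case (A symmetric, W_in = W_out) shows why the framework leaves homophilic benchmarks unchanged while adding expressive power on directed heterophilic graphs.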
Related papers
- Design Your Own Universe: A Physics-Informed Agnostic Method for Enhancing Graph Neural Networks [34.16727363891593]
We propose a model-agnostic enhancement framework for Graph Neural Networks (GNNs).
This framework enriches the graph structure by introducing additional nodes and rewiring connections with both positive and negative weights.
We theoretically verify that GNNs enhanced through our approach can effectively circumvent the over-smoothing issue and exhibit robustness against over-squashing.
Empirical validations on homophilic, heterophilic, and long-term graph benchmarks show that GNNs enhanced by our method significantly outperform their original counterparts.
arXiv Detail & Related papers (2024-01-26T00:47:43Z)
- Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
The generalization ability of existing GNNs degrades when there are distribution shifts between training and testing graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z)
- Breaking the Entanglement of Homophily and Heterophily in Semi-supervised Node Classification [25.831508778029097]
We introduce AMUD, which quantifies the relationship between node profiles and topology from a statistical perspective.
We also propose ADPA as a new directed graph learning paradigm for AMUD.
arXiv Detail & Related papers (2023-12-07T07:54:11Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Make Heterophily Graphs Better Fit GNN: A Graph Rewiring Approach [43.41163711340362]
We propose a method named Deep Heterophily Graph Rewiring (DHGR) to rewire graphs by adding homophilic edges and pruning heterophilic edges.
To the best of our knowledge, this is the first work to study graph rewiring for heterophilic graphs.
arXiv Detail & Related papers (2022-09-17T06:55:21Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- ES-GNN: Generalizing Graph Neural Networks Beyond Homophily with Edge Splitting [32.69196871253339]
We propose a novel Edge Splitting GNN (ES-GNN) framework that adaptively distinguishes graph edges relevant to the learning task from those that are irrelevant.
We show that our ES-GNN can be regarded as a solution to a disentangled graph denoising problem.
arXiv Detail & Related papers (2022-05-27T01:29:03Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that organizes existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- GPT-GNN: Generative Pre-Training of Graph Neural Networks [93.35945182085948]
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
We present the GPT-GNN framework to initialize GNNs by generative pre-training.
We show that GPT-GNN outperforms state-of-the-art GNN models trained without pre-training by up to 9.1% across various downstream tasks.
arXiv Detail & Related papers (2020-06-27T20:12:33Z)