GraphPatcher: Mitigating Degree Bias for Graph Neural Networks via Test-time Augmentation
- URL: http://arxiv.org/abs/2310.00800v1
- Date: Sun, 1 Oct 2023 21:50:03 GMT
- Authors: Mingxuan Ju, Tong Zhao, Wenhao Yu, Neil Shah, Yanfang Ye
- Abstract summary: Graph neural networks (GNNs) usually perform satisfactorily on high-degree nodes with rich neighbor information but struggle with low-degree nodes.
We propose a test-time augmentation framework, namely GraphPatcher, to enhance test-time generalization of any GNNs on low-degree nodes.
GraphPatcher consistently enhances common GNNs' overall performance by up to 3.6% and low-degree performance by up to 6.5%, significantly outperforming state-of-the-art baselines.
- Score: 48.88356355021239
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent studies have shown that graph neural networks (GNNs) exhibit strong
biases towards the node degree: they usually perform satisfactorily on
high-degree nodes with rich neighbor information but struggle with low-degree
nodes. Existing works tackle this problem by deriving either designated GNN
architectures or training strategies specifically for low-degree nodes. Though
effective, these approaches unintentionally create an artificial
out-of-distribution scenario, where models mainly or even only observe
low-degree nodes during the training, leading to a downgraded performance for
high-degree nodes that GNNs originally perform well at. In light of this, we
propose a test-time augmentation framework, namely GraphPatcher, to enhance
test-time generalization of any GNNs on low-degree nodes. Specifically,
GraphPatcher iteratively generates virtual nodes to patch artificially created
low-degree nodes via corruptions, aiming at progressively reconstructing target
GNN's predictions over a sequence of increasingly corrupted nodes. Through this
scheme, GraphPatcher not only learns how to enhance low-degree nodes (when the
neighborhoods are heavily corrupted) but also preserves the original superior
performance of GNNs on high-degree nodes (when lightly corrupted).
Additionally, GraphPatcher is model-agnostic and can also mitigate the degree
bias for either self-supervised or supervised GNNs. Comprehensive experiments
are conducted over seven benchmark datasets and GraphPatcher consistently
enhances common GNNs' overall performance by up to 3.6% and low-degree
performance by up to 6.5%, significantly outperforming state-of-the-art
baselines. The source code is publicly available at
https://github.com/jumxglhf/GraphPatcher.
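The corruption-then-patch scheme described in the abstract can be sketched in a few lines. Everything below is an illustrative assumption rather than the paper's implementation: the toy graph, the mean-aggregation layer, and especially the virtual-node feature, which here is a simple neighborhood-mean heuristic, whereas GraphPatcher generates it with a learned model trained to reconstruct the target GNN's predictions over a sequence of increasingly corrupted nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph (assumed example): 5 nodes with 4-dim features; node 0 is high-degree.
x = rng.normal(size=(5, 4))
neighbors = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}

def mean_aggregate(x, neighbors):
    # One mean-aggregation GNN-style layer: each node averages its own
    # feature with those of its neighbors.
    return np.stack([x[[v] + list(nbrs)].mean(axis=0)
                     for v, nbrs in sorted(neighbors.items())])

def corrupt(neighbors, v, keep):
    # Artificially create a low-degree view of node v by keeping only
    # `keep` randomly chosen neighbors (the "corruption" step).
    out = {u: list(n) for u, n in neighbors.items()}
    out[v] = list(rng.choice(out[v], size=min(keep, len(out[v])), replace=False))
    return out

def patch(x, neighbors, v):
    # Attach one virtual neighbor to v (the "patch" step). The virtual
    # feature here is a stand-in heuristic: the mean of v's remaining
    # ego-network. GraphPatcher would produce it with a learned generator.
    out = {u: list(n) for u, n in neighbors.items()}
    virtual = x[[v] + out[v]].mean(axis=0, keepdims=True)
    new_id = x.shape[0]
    out[new_id] = [v]
    out[v] = out[v] + [new_id]
    return np.vstack([x, virtual]), out

corrupted = corrupt(neighbors, v=0, keep=1)    # heavy corruption: degree 4 -> 1
x_patched, patched = patch(x, corrupted, v=0)  # one patching step: degree 1 -> 2
preds = mean_aggregate(x_patched, patched)     # target GNN run on the patched graph
```

Iterating `patch` several times mirrors the paper's progressive scheme: heavily corrupted (low-degree) nodes receive several virtual neighbors, while lightly corrupted (high-degree) nodes need few or none, which is how the method helps tail nodes without disturbing head nodes.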
Related papers
- Mitigating Degree Bias in Signed Graph Neural Networks [5.042342963087923]
Signed Graph Neural Networks (SGNNs) face fairness issues stemming from both the source data and the typical aggregation method.
This paper extends the investigation of fairness from GNNs to SGNNs.
We identify the issue of degree bias within signed graphs, offering a new perspective on the fairness issues related to SGNNs.
arXiv Detail & Related papers (2024-08-16T03:22:18Z)
- Node Duplication Improves Cold-start Link Prediction [52.917775253887264]
Graph Neural Networks (GNNs) are prominent in graph machine learning.
Recent studies show that GNNs struggle to produce good results on low-degree nodes.
We propose a simple yet surprisingly effective augmentation technique called NodeDup.
arXiv Detail & Related papers (2024-02-15T05:07:39Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that a separate weight matrix is learned for the nodes in each degree group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Geodesic Graph Neural Network for Efficient Graph Representation Learning [34.047527874184134]
We propose an efficient GNN framework called Geodesic GNN (GDGNN).
It injects conditional relationships between nodes into the model without labeling.
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
arXiv Detail & Related papers (2022-10-06T02:02:35Z)
- ResNorm: Tackling Long-tailed Degree Distribution Issue in Graph Neural Networks via Normalization [80.90206641975375]
This paper focuses on improving the performance of GNNs via normalization.
By studying the long-tailed distribution of node degrees in the graph, we propose a novel normalization method for GNNs.
The scale operation of ResNorm reshapes the node-wise standard deviation (NStd) distribution so as to improve the accuracy of tail nodes.
arXiv Detail & Related papers (2022-06-16T13:49:09Z)
- GraFN: Semi-Supervised Node Classification on Graph with Few Labels via Non-Parametric Distribution Assignment [5.879936787990759]
We propose a novel semi-supervised method for graphs, GraFN, which ensures that nodes belonging to the same class are grouped together.
GraFN randomly samples support nodes from labeled nodes and anchor nodes from the entire graph.
We experimentally show that GraFN surpasses both the semi-supervised and self-supervised methods in terms of node classification on real-world graphs.
arXiv Detail & Related papers (2022-04-04T08:22:30Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- Investigating and Mitigating Degree-Related Biases in Graph Convolutional Networks [62.8504260693664]
Graph Convolutional Networks (GCNs) show promising results for semi-supervised learning tasks on graphs.
In this paper, we analyze GCNs with regard to the node degree distribution.
We develop a novel Self-Supervised Degree-Specific GCN (SL-DSGC) that mitigates the degree biases of GCNs.
arXiv Detail & Related papers (2020-06-28T16:26:47Z)
- Graph Random Neural Network for Semi-Supervised Learning on Graphs [36.218650686748546]
We study the problem of semi-supervised learning on graphs, for which graph neural networks (GNNs) have been extensively explored.
Most existing GNNs inherently suffer from the limitations of over-smoothing, non-robustness, and weak-generalization when labeled nodes are scarce.
In this paper, we propose a simple yet effective framework -- Graph Random Neural Networks (GRAND) -- to address these issues.
arXiv Detail & Related papers (2020-05-22T09:40:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.