GraphMixup: Improving Class-Imbalanced Node Classification on Graphs by
Self-supervised Context Prediction
- URL: http://arxiv.org/abs/2106.11133v1
- Date: Mon, 21 Jun 2021 14:12:16 GMT
- Title: GraphMixup: Improving Class-Imbalanced Node Classification on Graphs by
Self-supervised Context Prediction
- Authors: Lirong Wu, Haitao Lin, Zhangyang Gao, Cheng Tan, Stan Z. Li
- Abstract summary: This paper presents GraphMixup, a novel mixup-based framework for improving class-imbalanced node classification on graphs.
We develop a Reinforcement Mixup mechanism to adaptively determine how many samples are to be generated by mixup for those minority classes.
Experiments on three real-world datasets show that GraphMixup yields truly encouraging results for class-imbalanced node classification tasks.
- Score: 25.679620842010422
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent years have witnessed great success in handling node classification
tasks with Graph Neural Networks (GNNs). However, most existing GNNs are based
on the assumption that node samples for different classes are balanced, while
for many real-world graphs, there exists the problem of class imbalance, i.e.,
some classes may have much fewer samples than others. In this case, directly
training a GNN classifier with raw data would under-represent samples from
those minority classes and result in sub-optimal performance. This paper
presents GraphMixup, a novel mixup-based framework for improving
class-imbalanced node classification on graphs. However, directly performing
mixup in the input space or embedding space may produce out-of-domain samples
due to the extreme sparsity of minority classes; hence we construct semantic
relation spaces that allow the Feature Mixup to be performed at the semantic
level. Moreover, we apply two context-based self-supervised techniques to
capture both local and global information in the graph structure and then
propose Edge Mixup specifically for graph data. Finally, we develop a
\emph{Reinforcement Mixup} mechanism to adaptively determine how many samples
are to be generated by mixup for those minority classes. Extensive experiments
on three real-world datasets show that GraphMixup yields truly encouraging
results for class-imbalanced node classification tasks.
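To make the underlying idea concrete, below is a minimal, illustrative PyTorch sketch of vanilla mixup applied to minority-class node embeddings. It is not the paper's implementation: GraphMixup performs Feature Mixup in learned semantic relation spaces, adds Edge Mixup, and learns the upsampling ratio with reinforcement learning, none of which are shown here. The function name and parameters are hypothetical.

```python
import torch

def mixup_minority_embeddings(z, y, minority_class, num_new, alpha=1.0, seed=0):
    """Synthesize `num_new` extra minority-class samples by interpolating
    pairs of same-class node embeddings (vanilla mixup, illustrative only)."""
    g = torch.Generator().manual_seed(seed)
    idx = (y == minority_class).nonzero(as_tuple=True)[0]
    assert len(idx) >= 2, "need at least two minority nodes to mix"
    # Randomly pair minority nodes with other nodes of the same class.
    a = idx[torch.randint(len(idx), (num_new,), generator=g)]
    b = idx[torch.randint(len(idx), (num_new,), generator=g)]
    lam = torch.distributions.Beta(alpha, alpha).sample((num_new, 1))
    z_new = lam * z[a] + (1.0 - lam) * z[b]           # feature-level interpolation
    y_new = torch.full((num_new,), minority_class)    # same-class pairs keep the label
    return z_new, y_new

# Toy usage: 16-dim embeddings for 100 nodes; class 2 is the minority.
z = torch.randn(100, 16)
y = torch.randint(0, 3, (100,))
z_syn, y_syn = mixup_minority_embeddings(z, y, minority_class=2, num_new=8)
z_aug, y_aug = torch.cat([z, z_syn]), torch.cat([y, y_syn])
print(z_aug.shape, y_aug.shape)  # torch.Size([108, 16]) torch.Size([108])
```

Because both endpoints of each interpolation belong to the same minority class, the synthetic samples keep that class label; the paper's contribution is in constraining where this interpolation happens and how many samples to generate.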
Related papers
- Open-World Semi-Supervised Learning for Node Classification [53.07866559269709]
Open-world semi-supervised learning (Open-world SSL) for node classification is a practical but under-explored problem in the graph community.
We propose an IMbalance-Aware method named OpenIMA for Open-world semi-supervised node classification.
arXiv Detail & Related papers (2024-03-18T05:12:54Z) - Chasing Fairness in Graphs: A GNN Architecture Perspective [73.43111851492593]
We propose Fair Message Passing (FMP), designed within a unified optimization framework for graph neural networks (GNNs).
In FMP, aggregation is first adopted to utilize neighbors' information, and then a bias-mitigation step explicitly pushes the representation centers of demographic groups closer together.
Experiments on node classification tasks demonstrate that the proposed FMP outperforms several baselines in terms of fairness and accuracy on three real-world datasets.
arXiv Detail & Related papers (2023-12-19T18:00:15Z) - Heterophily-Based Graph Neural Network for Imbalanced Classification [19.51668009720269]
We introduce a unique approach that tackles imbalanced classification on graphs by considering graph heterophily.
We propose Fast Im-GBK, which integrates an imbalanced-classification strategy with heterophily-aware GNNs.
Our experiments on real-world graphs demonstrate our model's superiority in classification performance and efficiency for node classification tasks.
arXiv Detail & Related papers (2023-10-12T21:19:47Z) - GraphSHA: Synthesizing Harder Samples for Class-Imbalanced Node
Classification [64.85392028383164]
Class imbalance is the phenomenon that some classes have much fewer instances than others.
Recent studies find that off-the-shelf Graph Neural Networks (GNNs) under-represent samples from minority classes.
We propose GraphSHA, a general framework for Synthesizing HArder minority samples.
arXiv Detail & Related papers (2023-06-16T04:05:58Z) - Synthetic Over-sampling for Imbalanced Node Classification with Graph
Neural Networks [34.81248024048974]
Graph neural networks (GNNs) have achieved state-of-the-art performance for node classification.
In many real-world scenarios, node classes are imbalanced, with some majority classes making up most of the graph.
In this work, we seek to address this problem by generating pseudo instances of minority classes to balance the training data.
arXiv Detail & Related papers (2022-06-10T19:47:05Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Imbalanced Graph Classification via Graph-of-Graph Neural Networks [16.589373163769853]
Graph Neural Networks (GNNs) have achieved unprecedented success in learning graph representations to identify categorical labels of graphs.
We introduce a novel framework, Graph-of-Graph Neural Networks (G$^2$GNN), which alleviates the graph imbalance issue by deriving extra supervision globally from neighboring graphs and locally from graphs themselves.
Our proposed G$^2$GNN outperforms numerous baselines by roughly 5% in both F1-macro and F1-micro scores.
arXiv Detail & Related papers (2021-12-01T02:25:47Z) - Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z) - Graph Classification by Mixture of Diverse Experts [67.33716357951235]
We present GraphDIVE, a framework leveraging mixture of diverse experts for imbalanced graph classification.
With a divide-and-conquer principle, GraphDIVE employs a gating network to partition an imbalanced graph dataset into several subsets.
Experiments on real-world imbalanced graph datasets demonstrate the effectiveness of GraphDIVE.
arXiv Detail & Related papers (2021-03-29T14:03:03Z) - GraphSMOTE: Imbalanced Node Classification on Graphs with Graph Neural
Networks [28.92347073786722]
Graph neural networks (GNNs) have achieved state-of-the-art performance in node classification.
We propose a novel framework, GraphSMOTE, in which an embedding space is constructed to encode the similarity among the nodes.
New samples are synthesized in this space to ensure genuineness (a minimal sketch of this recipe follows below).
arXiv Detail & Related papers (2021-03-16T03:23:55Z)
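As a companion to the GraphSMOTE entry above, here is a minimal sketch of the general recipe its summary describes: SMOTE-style interpolation in an embedding space plus a simple edge decoder that connects synthetic nodes back to the graph. This is an illustration under stated assumptions, not GraphSMOTE's actual implementation; the function name, the nearest-neighbour pairing, and the inner-product decoder with `edge_threshold` are hypothetical choices.

```python
import torch

def smote_in_embedding_space(z, y, minority_class, num_new, edge_threshold=0.9, seed=0):
    """Interpolate each sampled minority node toward its nearest same-class
    neighbour in embedding space, then predict edges for the synthetic node
    with a simple inner-product decoder (illustrative only)."""
    g = torch.Generator().manual_seed(seed)
    idx = (y == minority_class).nonzero(as_tuple=True)[0]
    assert len(idx) >= 2, "need at least two minority nodes to interpolate"
    pos = torch.randint(len(idx), (num_new,), generator=g)   # positions within idx
    src = idx[pos]
    d = torch.cdist(z[src], z[idx])                          # (num_new, |minority|)
    d[torch.arange(num_new), pos] = float("inf")             # exclude the node itself
    nbr = idx[d.argmin(dim=1)]                               # nearest same-class neighbour
    delta = torch.rand(num_new, 1, generator=g)
    z_new = z[src] + delta * (z[nbr] - z[src])               # SMOTE-style interpolation
    y_new = torch.full((num_new,), minority_class)
    adj_new = torch.sigmoid(z_new @ z.t()) > edge_threshold  # (num_new, N) predicted edges
    return z_new, y_new, adj_new
```

The main design difference from the mixup sketch earlier is the pairing rule (nearest same-class neighbour rather than a random same-class partner with a Beta-distributed weight) and the explicit edge-prediction step, which gives each synthetic node a place in the graph structure.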
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.