Distance-wise Prototypical Graph Neural Network in Node Imbalance
Classification
- URL: http://arxiv.org/abs/2110.12035v1
- Date: Fri, 22 Oct 2021 19:43:15 GMT
- Title: Distance-wise Prototypical Graph Neural Network in Node Imbalance
Classification
- Authors: Yu Wang, Charu Aggarwal, Tyler Derr
- Abstract summary: We propose a novel Distance-wise Prototypical Graph Neural Network (DPGNN) for imbalanced graph data.
The proposed DPGNN almost always significantly outperforms all other baselines, which demonstrates its effectiveness in imbalanced node classification.
- Score: 9.755229198654922
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent years have witnessed the significant success of applying graph neural
networks (GNNs) in learning effective node representations for classification.
However, current GNNs are mostly built under the assumption of balanced data splits, which
is inconsistent with many real-world networks where the number of training
nodes can be extremely imbalanced among the classes. Thus, directly utilizing
current GNNs on imbalanced data would generate coarse representations of nodes
in minority classes and ultimately compromise the classification performance.
This therefore underscores the importance of developing effective GNNs for
handling imbalanced graph data. In this work, we propose a novel Distance-wise
Prototypical Graph Neural Network (DPGNN), which introduces a class
prototype-driven training to balance the training loss between majority and
minority classes and then leverages distance metric learning to differentiate
the contributions of different dimensions of representations and fully encode
the relative position of each node to each class prototype. Moreover, we design
a new imbalanced label propagation mechanism to derive extra supervision from
unlabeled nodes and employ self-supervised learning to smooth representations
of adjacent nodes while separating inter-class prototypes. Comprehensive node
classification experiments and parameter analysis on multiple networks are
conducted, and the proposed DPGNN almost always significantly outperforms all
other baselines, demonstrating its effectiveness in imbalanced node
classification. The implementation of DPGNN is available at
https://github.com/YuWVandy/DPGNN.
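As a rough illustration of the class prototype-driven, distance-wise training described above, the sketch below builds one prototype per class from labeled node embeddings (so every class contributes equally to the loss, however few training nodes it has) and scores each node by a learnable, dimension-weighted distance to every prototype. It is a minimal sketch around any GNN encoder's output, not the authors' code; the class name, the softplus weighting, and the cross-entropy usage line are assumptions.

```python
# Minimal, illustrative sketch of prototype-driven, distance-wise classification.
# Not the official DPGNN implementation (see the repository linked above).
import torch
import torch.nn.functional as F


class DistanceWisePrototypeHead(torch.nn.Module):  # hypothetical name
    """Scores nodes by a learnable weighted distance to per-class prototypes."""

    def __init__(self, dim, num_classes):
        super().__init__()
        # One learnable weight per embedding dimension, so dimensions can
        # contribute unequally to the distance metric (distance metric learning).
        self.dim_weights = torch.nn.Parameter(torch.ones(dim))
        self.num_classes = num_classes

    def forward(self, z, labels, train_mask):
        # z: [N, dim] node embeddings from any GNN encoder.
        # Class prototype = mean embedding of that class's labeled nodes, so each
        # class contributes one prototype regardless of its training-set size.
        protos = torch.stack([
            z[train_mask & (labels == c)].mean(dim=0)
            for c in range(self.num_classes)
        ])                                                       # [C, dim]
        diff = z.unsqueeze(1) - protos.unsqueeze(0)              # [N, C, dim]
        # Weighted squared Euclidean distance; softplus keeps weights positive.
        dist = (F.softplus(self.dim_weights) * diff.pow(2)).sum(dim=-1)  # [N, C]
        # Nodes closer to a class prototype receive a higher score for that class.
        return -dist


# Hypothetical usage with a precomputed embedding matrix z:
#   logits = head(z, labels, train_mask)
#   loss = F.cross_entropy(logits[train_mask], labels[train_mask])
```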
Related papers
- Graph Mining under Data scarcity [6.229055041065048]
We propose an Uncertainty Estimator framework that can be applied on top of any generic Graph Neural Network (GNN).
We train these models under the classic episodic learning paradigm in an $n$-way, $k$-shot fashion, in an end-to-end setting.
Our method outperforms the baselines, which demonstrates the efficacy of the Uncertainty Estimator for few-shot node classification on graphs with a GNN.
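As context for the episodic paradigm mentioned above, here is a minimal sketch of how an $n$-way, $k$-shot episode could be sampled from a graph's labeled nodes; the function name and the query-set size are illustrative assumptions, not details taken from the paper.

```python
# Illustrative n-way, k-shot episode sampler for few-shot node classification.
import random
from collections import defaultdict


def sample_episode(labels, n_way=5, k_shot=3, q_query=5):
    """Sample one episode: n_way classes, k_shot support and q_query query nodes each."""
    by_class = defaultdict(list)
    for node, c in enumerate(labels):
        by_class[c].append(node)
    # Only classes with enough labeled nodes can appear in an episode.
    eligible = [c for c, nodes in by_class.items() if len(nodes) >= k_shot + q_query]
    classes = random.sample(eligible, n_way)
    support, query = [], []
    for c in classes:
        picked = random.sample(by_class[c], k_shot + q_query)
        support += [(node, c) for node in picked[:k_shot]]
        query += [(node, c) for node in picked[k_shot:]]
    return support, query
```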
arXiv Detail & Related papers (2024-06-07T10:50:03Z)
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
Backpropagation (BP) has limitations that challenge its biological plausibility and affect the efficiency, scalability and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
arXiv Detail & Related papers (2024-06-04T07:24:51Z)
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess the performance of a specific GNN model, trained on labeled and observed graphs, on unseen graphs without labels.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Heterophily-Based Graph Neural Network for Imbalanced Classification [19.51668009720269]
We introduce a unique approach that tackles imbalanced classification on graphs by considering graph heterophily.
We propose Fast Im-GBK, which integrates an imbalance classification strategy with heterophily-aware GNNs.
Our experiments on real-world graphs demonstrate our model's superiority in classification performance and efficiency for node classification tasks.
arXiv Detail & Related papers (2023-10-12T21:19:47Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- When Do Graph Neural Networks Help with Node Classification? Investigating the Impact of Homophily Principle on Node Distinguishability [92.8279562472538]
The homophily principle has been believed to be the main reason for the performance superiority of Graph Neural Networks (GNNs) over Neural Networks on node classification tasks.
Recent research suggests that, even in the absence of homophily, the advantage of GNNs still exists as long as nodes from the same class share similar neighborhood patterns.
arXiv Detail & Related papers (2023-04-25T09:40:47Z)
- Graph Neural Network with Curriculum Learning for Imbalanced Node Classification [21.085314408929058]
Graph Neural Network (GNN) is an emerging technique for graph-based learning tasks such as node classification.
In this work, we reveal the vulnerability of GNN to the imbalance of node labels.
We propose a novel graph neural network framework with curriculum learning (GNN-CL) consisting of two modules.
arXiv Detail & Related papers (2022-02-05T10:46:11Z)
- Shift-Robust GNNs: Overcoming the Limitations of Localized Graph Training Data [52.771780951404565]
Shift-Robust GNN (SR-GNN) is designed to account for distributional differences between biased training data and the graph's true inference distribution.
We show that SR-GNN outperforms other GNN baselines in accuracy, eliminating at least 40% of the negative effects introduced by biased training data.
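The summary above describes correcting for the distributional gap between biased training nodes and the rest of the graph. As a generic illustration of that idea (a simplified moment-matching penalty, not the SR-GNN code), the sketch below compares the first few central moments of the training-node embeddings with those of the remaining nodes; the function name, the moment order, and the usage line are assumptions.

```python
# Illustrative moment-matching penalty between two groups of node embeddings,
# e.g. biased training nodes vs. the rest of the graph. Not the SR-GNN code.
import torch


def moment_gap(x, y, k=3):
    """Distance between the first k central moments of x [N, d] and y [M, d]."""
    mx, my = x.mean(dim=0), y.mean(dim=0)
    gap = (mx - my).norm()
    cx, cy = x - mx, y - my
    for order in range(2, k + 1):
        gap = gap + (cx.pow(order).mean(dim=0) - cy.pow(order).mean(dim=0)).norm()
    return gap


# Hypothetical usage: total_loss = task_loss + lam * moment_gap(z[train_mask], z[~train_mask])
```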
arXiv Detail & Related papers (2021-08-02T18:00:38Z)
- ImGAGN: Imbalanced Network Embedding via Generative Adversarial Graph Networks [19.45752945234785]
Imbalanced classification on graphs is ubiquitous yet challenging in many real-world applications, such as fraudulent node detection.
We present a generative adversarial graph network model, called ImGAGN, to address the imbalanced classification problem on graphs.
We show that the proposed method ImGAGN outperforms state-of-the-art algorithms on the semi-supervised imbalanced node classification task.
arXiv Detail & Related papers (2021-06-05T06:56:37Z)
- A Collective Learning Framework to Boost GNN Expressiveness [25.394456460032625]
We consider the task of inductive node classification using Graph Neural Networks (GNNs) in supervised and semi-supervised settings.
We propose a general collective learning approach to increase the representation power of any existing GNN.
We evaluate performance on five real-world network datasets and demonstrate consistent, significant improvement in node classification accuracy.
arXiv Detail & Related papers (2020-03-26T22:07:28Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
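The BGNN entry above describes augmenting the usual weighted sum of neighbor representations with pairwise interactions between neighbors. The sketch below shows one way such an aggregator could look for a single node; the function name and the mean normalization are illustrative assumptions, not the paper's exact operator.

```python
# Illustrative sum-plus-pairwise-interaction aggregation over one node's neighbors.
import torch


def bilinear_aggregate(h_neighbors):
    """h_neighbors: [n, d] (already transformed) representations of a node's neighbors."""
    n = h_neighbors.shape[0]
    # Standard linear aggregation (mean of neighbors).
    linear_part = h_neighbors.mean(dim=0)
    # Sum of element-wise products over all unordered neighbor pairs, computed
    # without a double loop via sum_{i<j} a_i*a_j = ((sum a)^2 - sum a^2) / 2.
    s = h_neighbors.sum(dim=0)
    pair_sum = 0.5 * (s.pow(2) - h_neighbors.pow(2).sum(dim=0))
    bilinear_part = pair_sum / max(n * (n - 1) // 2, 1)
    return linear_part + bilinear_part
```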