Graph Classification by Mixture of Diverse Experts
- URL: http://arxiv.org/abs/2103.15622v1
- Date: Mon, 29 Mar 2021 14:03:03 GMT
- Title: Graph Classification by Mixture of Diverse Experts
- Authors: Fenyu Hu, Liping Wang, Shu Wu, Liang Wang, Tieniu Tan
- Abstract summary: We present GraphDIVE, a framework leveraging mixture of diverse experts for imbalanced graph classification.
With a divide-and-conquer principle, GraphDIVE employs a gating network to partition an imbalanced graph dataset into several subsets.
Experiments on real-world imbalanced graph datasets demonstrate the effectiveness of GraphDIVE.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph classification is a challenging research problem in many applications
across a broad range of domains. In these applications, it is very common that
class distribution is imbalanced. Recently, Graph Neural Network (GNN) models
have achieved superior performance on various real-world datasets. Despite
their success, most current GNN models largely overlook the important
setting of imbalanced class distribution, which typically results in prediction
bias towards majority classes. To alleviate the prediction bias, we propose to
leverage semantic structure of dataset based on the distribution of node
embedding. Specifically, we present GraphDIVE, a general framework leveraging
mixture of diverse experts (i.e., graph classifiers) for imbalanced graph
classification. With a divide-and-conquer principle, GraphDIVE employs a gating
network to partition an imbalanced graph dataset into several subsets. Then
each expert network is trained based on its corresponding subset. Experiments
on real-world imbalanced graph datasets demonstrate the effectiveness of
GraphDIVE.
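The gating-plus-experts idea described in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the gating network and the experts are plain linear maps, the graph-level embedding `h` stands in for pooled GNN output, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class MixtureOfExperts:
    """Gate-weighted mixture of linear experts over graph-level embeddings."""

    def __init__(self, dim, n_experts, n_classes):
        # Hypothetical parameters: a linear gating network and one
        # linear classifier ("expert") per dataset subset.
        self.W_gate = rng.normal(0.0, 0.1, (dim, n_experts))
        self.W_exp = rng.normal(0.0, 0.1, (n_experts, dim, n_classes))

    def forward(self, h):
        # h: (batch, dim) graph embeddings, e.g. pooled GNN node features.
        gates = softmax(h @ self.W_gate)                  # (batch, n_experts)
        logits = np.einsum('bd,edc->bec', h, self.W_exp)  # per-expert logits
        probs = softmax(logits, axis=-1)                  # (batch, n_experts, n_classes)
        # Soft partition: each expert's prediction is weighted by its gate.
        return np.einsum('be,bec->bc', gates, probs)

moe = MixtureOfExperts(dim=8, n_experts=3, n_classes=2)
h = rng.normal(size=(4, 8))
p = moe.forward(h)  # (4, 2) mixture class probabilities
```

In this reading, the gating weights implement the divide-and-conquer partition softly: each expert specializes on the subset of graphs for which its gate is high, which is the mechanism the abstract describes for counteracting majority-class bias.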
Related papers
- Graph Classification via Reference Distribution Learning: Theory and Practice [24.74871206083017]
This work introduces Graph Reference Distribution Learning (GRDL), an efficient and accurate graph classification method.
GRDL treats each graph's latent node embeddings given by GNN layers as a discrete distribution, enabling direct classification without global pooling.
Experiments on moderate-scale and large-scale graph datasets show the superiority of GRDL over the state-of-the-art.
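The "classify by comparing distributions of node embeddings" idea can be sketched as nearest-reference-distribution assignment. GRDL's actual distance and its learned reference distributions are not specified in this summary; the sketch below uses a biased MMD estimate with an RBF kernel purely as a hypothetical stand-in metric, with fixed rather than learned references.

```python
import numpy as np

def mmd2(X, Y, gamma=1.0):
    """Biased squared MMD estimate between two point sets (RBF kernel)."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

def classify(node_emb, references):
    """Assign the class whose reference distribution is nearest in MMD.

    node_emb:   (n_nodes, dim) embeddings of one graph (no global pooling).
    references: list of (m, dim) reference point sets, one per class.
    """
    return int(np.argmin([mmd2(node_emb, R) for R in references]))
```

The point of the sketch is the interface: the whole set of node embeddings is compared against each class reference, so no pooling step collapses the graph to a single vector.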
arXiv Detail & Related papers (2024-08-21T06:42:22Z)
- Graph Fairness Learning under Distribution Shifts [33.9878682279549]
Graph neural networks (GNNs) have achieved remarkable performance on graph-structured data.
GNNs may inherit prejudice from the training data and make discriminatory predictions based on sensitive attributes, such as gender and race.
We propose a graph generator to produce numerous graphs with significant bias and varying distribution distances.
arXiv Detail & Related papers (2024-01-30T06:51:24Z)
- Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Network (GNN) has demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z)
- RAHNet: Retrieval Augmented Hybrid Network for Long-tailed Graph Classification [10.806893809269074]
We propose a novel framework called Retrieval Augmented Hybrid Network (RAHNet) to jointly learn a robust feature extractor and an unbiased classifier.
In the feature extractor training stage, we develop a graph retrieval module to search for relevant graphs that directly enrich the intra-class diversity for the tail classes.
We also optimize a category-centered supervised contrastive loss to obtain discriminative representations.
arXiv Detail & Related papers (2023-08-04T14:06:44Z)
- Debiasing Graph Neural Networks via Learning Disentangled Causal Substructure [46.86463923605841]
We investigate graph classification on training graphs with severe bias.
We find that GNNs tend to exploit spurious correlations when making decisions.
We propose a general disentangled GNN framework to learn the causal substructure and bias substructure.
arXiv Detail & Related papers (2022-09-28T13:55:52Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- OOD-GNN: Out-of-Distribution Generalized Graph Neural Network [73.67049248445277]
Graph neural networks (GNNs) have achieved impressive performance when testing and training graph data come from the same distribution.
Existing GNNs lack out-of-distribution generalization abilities so that their performance substantially degrades when there exist distribution shifts between testing and training graph data.
We propose an out-of-distribution generalized graph neural network (OOD-GNN) for achieving satisfactory performance on unseen testing graphs whose distributions differ from those of the training graphs.
arXiv Detail & Related papers (2021-12-07T16:29:10Z)
- Imbalanced Graph Classification via Graph-of-Graph Neural Networks [16.589373163769853]
Graph Neural Networks (GNNs) have achieved unprecedented success in learning graph representations to identify categorical labels of graphs.
We introduce a novel framework, Graph-of-Graph Neural Networks (G$2$GNN), which alleviates the graph imbalance issue by deriving extra supervision globally from neighboring graphs and locally from graphs themselves.
Our proposed G$2$GNN outperforms numerous baselines by roughly 5% in both F1-macro and F1-micro scores.
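Since the G$2$GNN gains are reported in both F1-macro and F1-micro, it helps to recall why the two differ under imbalance: F1-macro averages per-class F1 with equal weight, so minority-class failures drag it down, while F1-micro is computed from global counts and is dominated by the majority class (for single-label multiclass it equals accuracy). A small self-contained sketch of both metrics:

```python
import numpy as np

def f1_scores(y_true, y_pred, n_classes):
    """Return (F1-macro, F1-micro) for integer-labeled predictions."""
    tp = np.zeros(n_classes)
    fp = np.zeros(n_classes)
    fn = np.zeros(n_classes)
    for c in range(n_classes):
        tp[c] = np.sum((y_pred == c) & (y_true == c))
        fp[c] = np.sum((y_pred == c) & (y_true != c))
        fn[c] = np.sum((y_pred != c) & (y_true == c))
    # Per-class F1 = 2*TP / (2*TP + FP + FN); guard empty classes.
    per_class = 2 * tp / np.maximum(2 * tp + fp + fn, 1)
    macro = per_class.mean()                 # each class weighted equally
    micro = 2 * tp.sum() / (2 * tp.sum() + fp.sum() + fn.sum())
    return macro, micro
```

For example, on a 9:1 imbalanced set where a degenerate classifier always predicts the majority class, F1-micro stays at 0.9 while F1-macro collapses to about 0.47, which is exactly why imbalance papers report both.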
arXiv Detail & Related papers (2021-12-01T02:25:47Z)
- Stable Prediction on Graphs with Agnostic Distribution Shift [105.12836224149633]
Graph neural networks (GNNs) have been shown to be effective on various graph tasks with randomly separated training and testing data.
In real applications, however, the distribution of the training graphs may differ from that of the test graphs.
We propose a novel stable prediction framework for GNNs, which permits both locally and globally stable learning and prediction on graphs.
arXiv Detail & Related papers (2021-10-08T02:45:47Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes from an implicit distribution as enhanced negative samples.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.