G-Mixup: Graph Data Augmentation for Graph Classification
- URL: http://arxiv.org/abs/2202.07179v2
- Date: Wed, 16 Feb 2022 05:15:02 GMT
- Title: G-Mixup: Graph Data Augmentation for Graph Classification
- Authors: Xiaotian Han, Zhimeng Jiang, Ninghao Liu, Xia Hu
- Abstract summary: Mixup has shown superiority in improving the generalization and robustness of neural networks by interpolating features and labels between two random samples.
We propose $\mathcal{G}$-Mixup to augment graphs for graph classification by interpolating the generator (i.e., graphon) of different classes of graphs.
Experiments show that $\mathcal{G}$-Mixup substantially improves the generalization and robustness of GNNs.
- Score: 55.63157775049443
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work develops \emph{mixup for graph data}. Mixup has shown superiority
in improving the generalization and robustness of neural networks by
interpolating features and labels between two random samples. Traditionally,
Mixup can work on regular, grid-like, and Euclidean data such as image or
tabular data. However, it is challenging to directly adopt Mixup to augment
graph data because different graphs typically: 1) have different numbers of
nodes; 2) are not readily aligned; and 3) have unique topologies in
non-Euclidean space. To this end, we propose $\mathcal{G}$-Mixup to augment
graphs for graph classification by interpolating the generator (i.e., graphon)
of different classes of graphs. Specifically, we first use graphs within the
same class to estimate a graphon. Then, instead of directly manipulating
graphs, we interpolate graphons of different classes in the Euclidean space to
get mixed graphons, where the synthetic graphs are generated through sampling
based on the mixed graphons. Extensive experiments show that
$\mathcal{G}$-Mixup substantially improves the generalization and robustness of
GNNs.
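The abstract describes a three-step pipeline: estimate a graphon per class, mix the graphons in Euclidean space, then sample synthetic graphs from the mixed graphon. A minimal sketch of that pipeline is below; the function names and the degree-sorted averaging estimator are illustrative assumptions (the paper itself evaluates more sophisticated graphon estimators), not the authors' implementation.

```python
import numpy as np

def estimate_graphon(adjs, k):
    """Estimate a class graphon as a k x k step function: align each graph's
    nodes by (descending) degree, resize its adjacency matrix to k x k, and
    average over the class. This simple estimator is an assumption for
    illustration; other estimators could be substituted."""
    acc = np.zeros((k, k))
    for a in adjs:
        order = np.argsort(-a.sum(axis=0))   # align nodes by degree
        a = a[np.ix_(order, order)]
        n = a.shape[0]
        idx = np.arange(k) * n // k          # nearest-neighbor resize to k x k
        acc += a[np.ix_(idx, idx)]
    return acc / len(adjs)

def g_mixup(graphon_a, graphon_b, lam, n, rng):
    """Mix two class graphons linearly, then sample a synthetic n-node
    undirected graph whose edges are Bernoulli draws from the mixed graphon."""
    w = lam * graphon_a + (1 - lam) * graphon_b  # interpolation in Euclidean space
    u = rng.integers(0, w.shape[0], size=n)      # latent block of each node
    p = w[np.ix_(u, u)]                          # pairwise edge probabilities
    adj = (rng.random((n, n)) < p).astype(int)
    adj = np.triu(adj, 1)                        # undirected, no self-loops
    return adj + adj.T
```

A mixed label would be formed the same way, e.g. `lam * y_a + (1 - lam) * y_b`, so each synthetic graph carries a soft label between the two classes.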
Related papers
- Homophily-Related: Adaptive Hybrid Graph Filter for Multi-View Graph
Clustering [29.17784041837907]
We propose an Adaptive Hybrid Graph Filter for Multi-View Graph Clustering (AHGFC).
AHGFC learns the node embedding based on the graph joint aggregation matrix.
Experimental results show that our proposed model performs well on six datasets containing homophilous and heterophilous graphs.
arXiv Detail & Related papers (2024-01-05T07:27:29Z)
- Finding the Missing-half: Graph Complementary Learning for Homophily-prone and Heterophily-prone Graphs [48.79929516665371]
Graphs with homophily-prone edges tend to connect nodes with the same class.
Heterophily-prone edges tend to build relationships between nodes with different classes.
Existing GNNs only take the original graph during training.
arXiv Detail & Related papers (2023-06-13T08:06:10Z)
- Graph Mixup with Soft Alignments [49.61520432554505]
We study graph data augmentation by mixup, which has been used successfully on images.
We propose S-Mixup, a simple yet effective mixup method for graph classification by soft alignments.
arXiv Detail & Related papers (2023-06-11T22:04:28Z)
- Beyond Homophily: Reconstructing Structure for Graph-agnostic Clustering [15.764819403555512]
A graph cannot be identified as homophilic or heterophilic in advance, which makes choosing a suitable GNN model difficult.
We propose a novel graph clustering method with three key components: graph reconstruction, a mixed filter, and a dual graph clustering network.
Our method outperforms competing approaches on heterophilic graphs.
arXiv Detail & Related papers (2023-05-03T01:49:01Z)
- Graph Mixture of Experts: Learning on Large-Scale Graphs with Explicit Diversity Modeling [60.0185734837814]
Graph neural networks (GNNs) have found extensive applications in learning from graph data.
To bolster the generalization capacity of GNNs, it has become customary to augment training graph structures with techniques like graph augmentations.
This study introduces the concept of Mixture-of-Experts (MoE) to GNNs, with the aim of augmenting their capacity to adapt to a diverse range of training graph structures.
arXiv Detail & Related papers (2023-04-06T01:09:36Z)
- Model-Agnostic Augmentation for Accurate Graph Classification [19.824105919844495]
Graph augmentation is an essential strategy to improve the performance of graph-based tasks.
In this work, we introduce five desired properties for effective augmentation.
Our experiments on social networks and molecular graphs show that NodeSam and SubMix outperform existing approaches in graph classification.
arXiv Detail & Related papers (2022-02-21T10:37:53Z)
- AnchorGAE: General Data Clustering via $O(n)$ Bipartite Graph Convolution [79.44066256794187]
We show how to convert a non-graph dataset into a graph by introducing a generative graph model, which is used to build graph convolution networks (GCNs).
A bipartite graph constructed by anchors is updated dynamically to exploit the high-level information behind data.
We theoretically prove that the simple update will lead to degeneration and a specific strategy is accordingly designed.
arXiv Detail & Related papers (2021-11-12T07:08:13Z)
- Intrusion-Free Graph Mixup [33.07540841212878]
We present a simple yet effective regularization technique to improve the generalization of Graph Neural Networks (GNNs).
We leverage the recent advances in Mixup regularizer for vision and text, where random sample pairs and their labels are interpolated to create synthetic samples for training.
Our method can effectively regularize the graph classification learning, resulting in superior predictive accuracy over popular graph augmentation baselines.
arXiv Detail & Related papers (2021-10-18T14:16:00Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.