Graph Condensation via Receptive Field Distribution Matching
- URL: http://arxiv.org/abs/2206.13697v1
- Date: Tue, 28 Jun 2022 02:10:05 GMT
- Title: Graph Condensation via Receptive Field Distribution Matching
- Authors: Mengyang Liu, Shanchuan Li, Xinshi Chen, Le Song
- Abstract summary: This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
- Score: 61.71711656856704
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) enable the analysis of graphs using deep
learning, with promising results in capturing structured information in graphs.
This paper focuses on creating a small graph to represent the original graph,
so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to
synthesize a small graph whose receptive fields share a similar distribution.
Thus, we propose Graph Condensation via Receptive Field Distribution Matching
(GCDM), which is accomplished by optimizing the synthetic graph through the use
of a distribution matching loss quantified by maximum mean discrepancy (MMD).
Additionally, we demonstrate that the synthetic graph generated by GCDM is
highly generalizable to a variety of models in the evaluation phase and that the
condensation speed is significantly improved using this framework.
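In concrete terms, distribution matching with MMD compares kernel statistics of two sets of embeddings. Below is a minimal sketch, assuming the receptive fields of the original and synthetic graphs have already been encoded as fixed-size embeddings (e.g., by a randomly initialized GNN) and using a Gaussian kernel; the encoder choice, names, and dimensions are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of an MMD-based distribution-matching loss (illustrative only).
import torch

def gaussian_kernel(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Pairwise Gaussian (RBF) kernel k(x_i, y_j) = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq_dists = torch.cdist(x, y).pow(2)
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def mmd_loss(real: torch.Tensor, synth: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Biased empirical MMD^2 between two embedding sets:
    MMD^2 = E[k(r, r')] - 2 E[k(r, s)] + E[k(s, s')].
    """
    k_rr = gaussian_kernel(real, real, sigma).mean()
    k_rs = gaussian_kernel(real, synth, sigma).mean()
    k_ss = gaussian_kernel(synth, synth, sigma).mean()
    return k_rr - 2 * k_rs + k_ss

# Usage: embeddings of receptive fields from the original graph (many nodes)
# versus the small synthetic graph (few, learnable nodes).
real_emb = torch.randn(1000, 64)                     # stand-in for encoded receptive fields
synth_emb = torch.randn(50, 64, requires_grad=True)  # learnable synthetic features
loss = mmd_loss(real_emb, synth_emb)
loss.backward()  # gradients flow into the synthetic graph's parameters
```

Minimizing this loss with respect to the synthetic graph's learnable parameters pushes the two receptive-field distributions together, which is the intuition behind the GCDM objective.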
Related papers
- Graph Neural Networks with a Distribution of Parametrized Graphs [27.40566674759208]
We introduce latent variables to parameterize and generate multiple graphs.
We obtain the maximum likelihood estimate of the network parameters in an Expectation-Maximization framework.
arXiv Detail & Related papers (2023-10-25T06:38:24Z)
- GDM: Dual Mixup for Graph Classification with Limited Supervision [27.8982897698616]
Graph Neural Networks (GNNs) require a large number of labeled graph samples to obtain good performance on the graph classification task.
The performance of GNNs degrades significantly as the number of labeled graph samples decreases.
We propose a novel mixup-based graph augmentation method to generate new labeled graph samples (a minimal mixup sketch appears after this list).
arXiv Detail & Related papers (2023-09-18T20:17:10Z)
- Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Networks (GNNs) have demonstrated extraordinary performance in classifying graph properties.
Due to selection bias in training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z)
- A Spectral Analysis of Graph Neural Networks on Dense and Sparse Graphs [13.954735096637298]
We analyze how sparsity affects the graph spectra, and thus the performance of graph neural networks (GNNs) in node classification on dense and sparse graphs.
We show that GNNs can outperform spectral methods on sparse graphs, and illustrate these results with numerical examples on both synthetic and real graphs.
arXiv Detail & Related papers (2022-11-06T22:38:13Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- OOD-GNN: Out-of-Distribution Generalized Graph Neural Network [73.67049248445277]
Graph neural networks (GNNs) have achieved impressive performance when testing and training graph data come from the same distribution.
Existing GNNs lack out-of-distribution generalization ability, so their performance degrades substantially when there are distribution shifts between testing and training graph data.
We propose an out-of-distribution generalized graph neural network (OOD-GNN) for achieving satisfactory performance on unseen testing graphs whose distributions differ from those of the training graphs.
arXiv Detail & Related papers (2021-12-07T16:29:10Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve hierarchical representation learning of graph data with multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
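As referenced in the GDM entry above, here is a minimal sketch of mixup-based augmentation for graph classification: new labeled samples are convex combinations of existing ones. GDM's specific "dual" mixing scheme is not detailed in the summary, so this shows only the generic mixup recipe applied to pooled graph-level embeddings and one-hot labels; all names are illustrative assumptions.

```python
# Minimal sketch of mixup augmentation on graph-level embeddings (illustrative only).
import torch

def mixup_graph_embeddings(
    emb: torch.Tensor,      # (batch, dim) pooled graph-level embeddings
    labels: torch.Tensor,   # (batch, num_classes) one-hot labels
    alpha: float = 0.2,
) -> tuple[torch.Tensor, torch.Tensor]:
    """Return mixed embeddings and soft labels: x' = lam * x_i + (1 - lam) * x_j."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(emb.size(0))
    mixed_emb = lam * emb + (1 - lam) * emb[perm]
    mixed_labels = lam * labels + (1 - lam) * labels[perm]
    return mixed_emb, mixed_labels

# Usage: augment a small labeled batch before the classification head.
emb = torch.randn(8, 128)
labels = torch.eye(4)[torch.randint(0, 4, (8,))]  # random one-hot labels
mixed_emb, mixed_labels = mixup_graph_embeddings(emb, labels)
```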
This list is automatically generated from the titles and abstracts of the papers on this site.