Graph Condensation: A Survey
- URL: http://arxiv.org/abs/2401.11720v2
- Date: Mon, 22 Jul 2024 12:39:21 GMT
- Title: Graph Condensation: A Survey
- Authors: Xinyi Gao, Junliang Yu, Tong Chen, Guanhua Ye, Wentao Zhang, Hongzhi Yin
- Abstract summary: The rapid growth of graph data poses significant challenges in storage, transmission, and particularly the training of graph neural networks (GNNs).
To address these challenges, graph condensation (GC) has emerged as an innovative solution.
GC focuses on synthesizing a compact yet highly representative graph, enabling GNNs trained on it to achieve performance comparable to those trained on the original large graph.
- Score: 49.41718583061147
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The rapid growth of graph data poses significant challenges in storage, transmission, and particularly the training of graph neural networks (GNNs). To address these challenges, graph condensation (GC) has emerged as an innovative solution. GC focuses on synthesizing a compact yet highly representative graph, enabling GNNs trained on it to achieve performance comparable to those trained on the original large graph. The notable efficacy of GC and its broad prospects have garnered significant attention and spurred extensive research. This survey paper provides an up-to-date and systematic overview of GC, organizing existing research into five categories aligned with critical GC evaluation criteria: effectiveness, generalization, efficiency, fairness, and robustness. To facilitate an in-depth and comprehensive understanding of GC, this paper examines various methods under each category and thoroughly discusses two essential components within GC: optimization strategies and condensed graph generation. We also empirically compare and analyze representative GC methods with diverse optimization strategies based on the five proposed GC evaluation criteria. Finally, we explore the applications of GC in various fields, outline the related open-source libraries, and highlight the present challenges and novel insights, with the aim of promoting advancements in future research. The related resources can be found at https://github.com/XYGaoG/Graph-Condensation-Papers.
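The abstract describes GC only at a high level. As a concrete illustration of one widely used optimization strategy covered by the survey, gradient matching, the sketch below learns synthetic node features so that they induce GNN gradients similar to those produced by the original graph. The toy data, the simplified two-layer GCN, the identity structure for the condensed graph, and all hyperparameters are illustrative assumptions, not code from the paper.

```python
# Minimal, self-contained sketch of gradient-matching graph condensation.
# All sizes, the toy data, and the simplified GCN are illustrative assumptions.
import torch
import torch.nn.functional as F

def normalize_adj(a):
    """Symmetric normalization with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    a = a + torch.eye(a.size(0))
    d_inv_sqrt = a.sum(1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)

def gcn_forward(x, adj, w1, w2):
    """Two-layer GCN with a dense normalized adjacency matrix."""
    h = torch.relu(adj @ x @ w1)
    return adj @ h @ w2

# Toy "original" graph (random placeholders standing in for real data).
n, d, c = 200, 16, 4                         # nodes, feature dim, classes
x_real = torch.randn(n, d)
adj_real = normalize_adj((torch.rand(n, n) < 0.05).float())
y_real = torch.randint(0, c, (n,))

# Learnable condensed graph: features are free parameters, labels are fixed.
n_syn = 20
x_syn = torch.randn(n_syn, d, requires_grad=True)
y_syn = torch.arange(n_syn) % c              # balanced synthetic labels
adj_syn = normalize_adj(torch.eye(n_syn))    # simplest choice: identity structure

opt_syn = torch.optim.Adam([x_syn], lr=0.01)

for step in range(200):
    # Sample a fresh GNN initialization so the condensed graph is not tied to one model.
    w1 = torch.randn(d, 32, requires_grad=True)
    w2 = torch.randn(32, c, requires_grad=True)

    loss_real = F.cross_entropy(gcn_forward(x_real, adj_real, w1, w2), y_real)
    loss_syn = F.cross_entropy(gcn_forward(x_syn, adj_syn, w1, w2), y_syn)

    g_real = torch.autograd.grad(loss_real, (w1, w2), retain_graph=True)
    g_syn = torch.autograd.grad(loss_syn, (w1, w2), create_graph=True)

    # Gradient-matching objective: synthetic data should induce similar gradients.
    match = sum(F.mse_loss(gs, gr.detach()) for gs, gr in zip(g_syn, g_real))
    opt_syn.zero_grad()
    match.backward()
    opt_syn.step()
```

In practice, surveyed methods also generate the condensed adjacency (e.g., from the synthetic features), match gradients per class, and iterate over training trajectories of many GNN initializations; this sketch keeps only the core matching loop for brevity.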
Related papers
- GC-Bench: An Open and Unified Benchmark for Graph Condensation [54.70801435138878]
We develop a comprehensive Graph Condensation Benchmark (GC-Bench) to analyze the performance of graph condensation.
GC-Bench systematically investigates the characteristics of graph condensation in terms of the following dimensions: effectiveness, transferability, and complexity.
We have developed an easy-to-use library for training and evaluating different GC methods to facilitate reproducible research.
arXiv Detail & Related papers (2024-06-30T07:47:34Z)
- GC4NC: A Benchmark Framework for Graph Condensation on Node Classification with New Insights [30.796414860754837]
Graph condensation (GC) is an emerging technique designed to learn a significantly smaller graph that retains the essential information of the original graph.
This paper introduces GC4NC, a comprehensive framework for evaluating diverse GC methods on node classification.
Our systematic evaluation offers novel insights into how condensed graphs behave and the critical design choices that drive their success.
arXiv Detail & Related papers (2024-06-24T15:17:49Z)
- RobGC: Towards Robust Graph Condensation [61.259453496191696]
Graph neural networks (GNNs) have attracted widespread attention for their impressive capability in graph representation learning.
However, the increasing prevalence of large-scale graphs presents a significant challenge for GNN training due to their computational demands.
We propose graph condensation (GC) to generate an informative compact graph that enables efficient training of GNNs while retaining performance.
arXiv Detail & Related papers (2024-06-19T04:14:57Z)
- GCondenser: Benchmarking Graph Condensation [26.458605619132385]
This paper proposes the first large-scale graph condensation benchmark, GCondenser, to holistically evaluate and compare mainstream GC methods.
GCondenser includes a standardised GC paradigm consisting of condensation, validation, and evaluation procedures, and supports extension to new GC methods and datasets.
arXiv Detail & Related papers (2024-05-23T07:25:31Z)
- Rethinking and Accelerating Graph Condensation: A Training-Free Approach with Class Partition [56.26113670151363]
Graph condensation is a data-centric solution to replace the large graph with a small yet informative condensed graph.
Existing GC methods suffer from intricate optimization processes, necessitating excessive computing resources.
We propose a training-free GC framework termed Class-partitioned Graph Condensation (CGC).
CGC achieves state-of-the-art performance with a more efficient condensation process.
arXiv Detail & Related papers (2024-05-22T14:57:09Z)
- A Survey on Graph Condensation [14.94630644865636]
Graph condensation (GC) has emerged as a solution to address challenges arising from the escalating volume of graph data.
For a better understanding of GC and to distinguish it from other related topics, we present a formal definition of GC and establish a taxonomy.
We conclude by addressing challenges and limitations, outlining future directions, and offering concise guidelines to inspire future research in this field.
arXiv Detail & Related papers (2024-02-03T02:50:51Z)
- From Cluster Assumption to Graph Convolution: Graph-based Semi-Supervised Learning Revisited [51.24526202984846]
Graph-based semi-supervised learning (GSSL) has long been a hot research topic.
Graph convolutional networks (GCNs) have become the predominant technique owing to their promising performance.
arXiv Detail & Related papers (2023-09-24T10:10:21Z)
- A Survey on Heterogeneous Graph Embedding: Methods, Techniques, Applications and Sources [79.48829365560788]
Heterogeneous graphs (HGs), also known as heterogeneous information networks, have become ubiquitous in real-world scenarios.
HG embedding aims to learn representations in a lower-dimensional space while preserving the heterogeneous structures and semantics for downstream tasks.
arXiv Detail & Related papers (2020-11-30T15:03:47Z)