GCondenser: Benchmarking Graph Condensation
- URL: http://arxiv.org/abs/2405.14246v3
- Date: Wed, 10 Jul 2024 04:01:55 GMT
- Title: GCondenser: Benchmarking Graph Condensation
- Authors: Yilun Liu, Ruihong Qiu, Zi Huang
- Abstract summary: This paper proposes the first large-scale graph condensation benchmark, GCondenser, to holistically evaluate and compare mainstream GC methods.
GCondenser includes a standardised GC paradigm, consisting of condensation, validation, and evaluation procedures, as well as enabling extensions to new GC methods and datasets.
- Score: 26.458605619132385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Large-scale graphs are valuable for graph representation learning, yet the abundant data in these graphs hinders the efficiency of the training process. Graph condensation (GC) alleviates this issue by compressing the large graph into a significantly smaller one that still supports effective model training. Although recent research has introduced various approaches to improve the effectiveness of the condensed graph, comprehensive and practical evaluations across different GC methods are neglected. This paper proposes the first large-scale graph condensation benchmark, GCondenser, to holistically evaluate and compare mainstream GC methods. GCondenser includes a standardised GC paradigm, consisting of condensation, validation, and evaluation procedures, as well as enabling extensions to new GC methods and datasets. With GCondenser, a comprehensive performance study is conducted, presenting the effectiveness of existing methods. GCondenser is open-sourced and available at https://github.com/superallen13/GCondenser.
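To make the condensation objective concrete, here is a minimal sketch of gradient matching, one mainstream GC formulation (popularised by methods such as GCond). It is deliberately simplified: a linear classifier stands in for a GNN, the condensed adjacency is omitted, and all sizes are illustrative rather than taken from the paper.

```python
import torch
import torch.nn.functional as F

# Illustrative sizes: 1000 real nodes condensed down to 50 synthetic ones.
n_real, n_syn, d, c = 1000, 50, 32, 7
x_real = torch.randn(n_real, d)
y_real = torch.randint(0, c, (n_real,))

# Learnable condensed node features; condensed labels are fixed up front.
x_syn = torch.randn(n_syn, d, requires_grad=True)
y_syn = torch.arange(n_syn) % c

w = torch.randn(d, c, requires_grad=True)  # linear surrogate for a GNN

def model_grad(x, y, create_graph):
    loss = F.cross_entropy(x @ w, y)
    return torch.autograd.grad(loss, w, create_graph=create_graph)[0]

opt = torch.optim.Adam([x_syn], lr=0.01)
for step in range(200):
    g_real = model_grad(x_real, y_real, create_graph=False)
    g_syn = model_grad(x_syn, y_syn, create_graph=True)
    match = F.mse_loss(g_syn, g_real)  # match gradients, not raw data
    opt.zero_grad()
    match.backward()
    opt.step()
```

Full methods additionally parameterise a condensed adjacency matrix and match gradients across many surrogate initialisations and training steps; the sketch keeps a single fixed surrogate for brevity.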
Related papers
- GC-Bench: An Open and Unified Benchmark for Graph Condensation [54.70801435138878]
We develop a comprehensive Graph Condensation Benchmark (GC-Bench) to analyze the performance of graph condensation.
GC-Bench systematically investigates the characteristics of graph condensation in terms of the following dimensions: effectiveness, transferability, and complexity.
We have developed an easy-to-use library for training and evaluating different GC methods to facilitate reproducible research.
arXiv Detail & Related papers (2024-06-30T07:47:34Z)
- GC4NC: A Benchmark Framework for Graph Condensation on Node Classification with New Insights [30.796414860754837]
Graph condensation (GC) is an emerging technique designed to learn a significantly smaller graph that retains the essential information of the original graph.
This paper introduces GC4NC, a comprehensive framework for evaluating diverse GC methods on node classification.
Our systematic evaluation offers novel insights into how condensed graphs behave and the critical design choices that drive their success.
arXiv Detail & Related papers (2024-06-24T15:17:49Z)
- RobGC: Towards Robust Graph Condensation [61.259453496191696]
Graph neural networks (GNNs) have attracted widespread attention for their impressive capability of graph representation learning.
However, the increasing prevalence of large-scale graphs presents a significant challenge for GNN training due to their computational demands.
Graph condensation (GC) addresses this by generating an informative compact graph that enables efficient training of GNNs while retaining performance.
arXiv Detail & Related papers (2024-06-19T04:14:57Z)
- Rethinking and Accelerating Graph Condensation: A Training-Free Approach with Class Partition [56.26113670151363]
Graph condensation is a data-centric solution to replace the large graph with a small yet informative condensed graph.
Existing GC methods suffer from intricate optimization processes, necessitating excessive computing resources.
We propose a training-free GC framework termed Class-partitioned Graph Condensation (CGC)
CGC achieves state-of-the-art performance with a more efficient condensation process.
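As a rough illustration of the training-free, class-partitioned idea, one can summarise each class independently and take the resulting centroids as condensed nodes. This is a hedged sketch of the general principle, not CGC's exact algorithm; `condense_by_class` and `per_class` are hypothetical names.

```python
import numpy as np
from sklearn.cluster import KMeans

def condense_by_class(x, y, per_class=10):
    """Summarise each class by k-means centroids; no model training involved."""
    feats, labels = [], []
    for cls in np.unique(y):
        x_cls = x[y == cls]
        k = min(per_class, len(x_cls))
        centroids = KMeans(n_clusters=k, n_init=10).fit(x_cls).cluster_centers_
        feats.append(centroids)
        labels.append(np.full(k, cls))
    return np.vstack(feats), np.concatenate(labels)

# Example: 1000 nodes with 3 classes condensed to at most 30 nodes.
x = np.random.randn(1000, 16)
y = np.random.randint(0, 3, size=1000)
x_syn, y_syn = condense_by_class(x, y)
```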
arXiv Detail & Related papers (2024-05-22T14:57:09Z)
- EXGC: Bridging Efficiency and Explainability in Graph Condensation [30.60535282372542]
Graph condensation (GCond) has been introduced to distill large real datasets into a more concise yet information-rich synthetic graph.
Despite acceleration efforts, existing GCond methods still struggle with efficiency, especially on expansive web data graphs.
We propose the Efficient and eXplainable Graph Condensation method, which can markedly boost efficiency and inject explainability.
arXiv Detail & Related papers (2024-02-05T06:03:38Z)
- Graph Condensation: A Survey [49.41718583061147]
The rapid growth of graph data poses significant challenges in storage, transmission, and particularly the training of graph neural networks (GNNs).
To address these challenges, graph condensation (GC) has emerged as an innovative solution.
GC focuses on synthesising a compact yet highly representative graph, enabling GNNs trained on it to achieve performance comparable to those trained on the original large graph.
arXiv Detail & Related papers (2024-01-22T06:47:00Z)
- From Cluster Assumption to Graph Convolution: Graph-based Semi-Supervised Learning Revisited [51.24526202984846]
Graph-based semi-supervised learning (GSSL) has long been a hot research topic.
Graph convolutional networks (GCNs) have become the predominant GSSL technique owing to their promising performance.
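For context, a single GCN layer is symmetrically normalised neighbourhood averaging followed by a linear transform. A dense-matrix sketch (illustrative only, not tied to any specific paper above):

```python
import torch

def gcn_layer(adj, h, w):
    """One propagation step: relu(D^-1/2 (A + I) D^-1/2 H W)."""
    a_hat = adj + torch.eye(adj.size(0))       # add self-loops
    d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)    # inverse-sqrt degrees
    a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
    return torch.relu(a_norm @ h @ w)

# Example: 5 nodes, 8-dimensional features, 4-dimensional output.
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()            # symmetrise
out = gcn_layer(adj, torch.randn(5, 8), torch.randn(8, 4))
```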
arXiv Detail & Related papers (2023-09-24T10:10:21Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes GraphCoCo, an effective graph complementary contrastive learning approach.
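For reference, the objective underlying most GCL methods is InfoNCE computed between two augmented views of the same graph. The sketch below shows generic GCL, not GraphCoCo's specific complementary scheme:

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, tau=0.5):
    """InfoNCE loss: row i of z1 and z2 embed the same node under two views."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau            # pairwise cosine similarities
    labels = torch.arange(z1.size(0))     # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

# Two hypothetical augmented views of 64 node embeddings.
z_view1, z_view2 = torch.randn(64, 128), torch.randn(64, 128)
loss = info_nce(z_view1, z_view2)
```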
arXiv Detail & Related papers (2022-03-24T02:58:36Z)