GSLB: The Graph Structure Learning Benchmark
- URL: http://arxiv.org/abs/2310.05174v1
- Date: Sun, 8 Oct 2023 14:13:03 GMT
- Title: GSLB: The Graph Structure Learning Benchmark
- Authors: Zhixun Li, Liang Wang, Xin Sun, Yifan Luo, Yanqiao Zhu, Dingshuo Chen,
Yingtao Luo, Xiangxin Zhou, Qiang Liu, Shu Wu, Liang Wang, Jeffrey Xu Yu
- Abstract summary: Graph Structure Learning (GSL) has recently garnered considerable attention due to its ability to optimize both the parameters of Graph Neural Networks (GNNs) and the computation graph structure simultaneously.
There is no standard experimental setting or fair comparison for performance evaluation, which creates a great obstacle to understanding the progress in this field.
We develop a comprehensive Graph Structure Learning Benchmark (GSLB) curated from 20 diverse graph datasets and 16 distinct GSL algorithms.
- Score: 34.859275408785614
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Structure Learning (GSL) has recently garnered considerable attention
due to its ability to optimize both the parameters of Graph Neural Networks
(GNNs) and the computation graph structure simultaneously. Despite the
proliferation of GSL methods developed in recent years, there is no standard
experimental setting or fair comparison for performance evaluation, which
creates a great obstacle to understanding the progress in this field. To fill
this gap, we systematically analyze the performance of GSL in different
scenarios and develop a comprehensive Graph Structure Learning Benchmark (GSLB)
curated from 20 diverse graph datasets and 16 distinct GSL algorithms.
Specifically, GSLB systematically investigates the characteristics of GSL in
terms of three dimensions: effectiveness, robustness, and complexity. We
comprehensively evaluate state-of-the-art GSL algorithms in node- and
graph-level tasks, and analyze their performance in robust learning and model
complexity. Further, to facilitate reproducible research, we have developed an
easy-to-use library for training, evaluating, and visualizing different GSL
methods. Empirical results of our extensive experiments demonstrate the ability
of GSL and reveal its potential benefits on various downstream tasks, offering
insights and opportunities for future research. The code of GSLB is available
at: https://github.com/GSL-Benchmark/GSLB.
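The abstract's core idea — optimizing the computation graph structure jointly with the GNN parameters — can be sketched in a few lines. This is an illustrative toy (a sigmoid-parameterized soft adjacency, a one-layer propagation, and finite-difference gradient descent), not the GSLB library's API; all names and the update scheme are assumptions.

```python
import numpy as np

# Toy sketch of joint graph-structure + GNN-parameter learning:
# the adjacency is a learnable object (sigmoid over scores S),
# updated together with the GNN weights W by gradient descent.

rng = np.random.default_rng(0)
n_nodes, n_feats = 4, 3
X = rng.normal(size=(n_nodes, n_feats))   # node features
y = rng.normal(size=(n_nodes, 1))         # regression targets

S = rng.normal(size=(n_nodes, n_nodes))   # learnable edge scores
W = rng.normal(size=(n_feats, 1))         # one-layer GNN weights

def forward(S, W):
    A = 1.0 / (1.0 + np.exp(-S))          # soft adjacency in (0, 1)
    return A @ X @ W                      # propagate, then transform

def loss(S, W):
    return float(np.mean((forward(S, W) - y) ** 2))

def num_grad(f, P, eps=1e-5):
    # finite-difference gradient; fine for a toy example
    G = np.zeros_like(P)
    for i in np.ndindex(P.shape):
        P[i] += eps; hi = f()
        P[i] -= 2 * eps; lo = f()
        P[i] += eps
        G[i] = (hi - lo) / (2 * eps)
    return G

before = loss(S, W)
for _ in range(100):                      # joint gradient descent
    S -= 0.05 * num_grad(lambda: loss(S, W), S)
    W -= 0.05 * num_grad(lambda: loss(S, W), W)
after = loss(S, W)
print(after < before)                     # both structure and weights were fit
```

Real GSL methods replace the sigmoid scores with metric-learning, sampling, or attention-based structure learners and backpropagate analytically, but the two-parameter-group update loop is the shared skeleton.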
Related papers
- Enhancing Graph Self-Supervised Learning with Graph Interplay [8.775644935074407]
Graph Interplay (GIP) is an innovative and versatile approach that significantly enhances the performance of various existing GSSL methods it is combined with.
GIP advocates introducing direct graph-level communications by random inter-graph edges within standard batches.
Our empirical study demonstrates that GIP surpasses the performance of prevailing GSSL methods by significant margins.
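The batch-level trick summarized above — random inter-graph edges inside a standard batch — can be illustrated on a block-diagonal batched adjacency. This is a hedged sketch of the idea, not GIP's actual implementation; the function and variable names are assumptions.

```python
import numpy as np

# Sketch: within a batch of graphs (block-diagonal adjacency),
# add a few random symmetric edges *between* different graphs.

rng = np.random.default_rng(1)

def add_inter_graph_edges(A_batch, graph_ids, n_edges):
    """Add n_edges random symmetric edges between nodes of different graphs."""
    A = A_batch.copy()
    added = 0
    while added < n_edges:
        u, v = rng.integers(0, A.shape[0], size=2)
        if graph_ids[u] != graph_ids[v] and A[u, v] == 0:
            A[u, v] = A[v, u] = 1
            added += 1
    return A

# two 3-node path graphs batched block-diagonally
A1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
A_batch = np.block([[A1, np.zeros((3, 3), int)],
                    [np.zeros((3, 3), int), A1]])
graph_ids = np.array([0, 0, 0, 1, 1, 1])

A_gip = add_inter_graph_edges(A_batch, graph_ids, n_edges=2)
# the off-diagonal block now holds exactly the new inter-graph edges
print(int(A_gip[:3, 3:].sum()))  # prints 2
```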
arXiv Detail & Related papers (2024-10-05T07:05:21Z)
- Every Node is Different: Dynamically Fusing Self-Supervised Tasks for Attributed Graph Clustering [59.45743537594695]
We propose Dynamically Fusing Self-Supervised Learning (DyFSS) for graph clustering.
DyFSS fuses features extracted from diverse SSL tasks using distinct weights derived from a gating network.
Experiments show DyFSS outperforms state-of-the-art multi-task SSL methods by up to 8.66% on the accuracy metric.
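The fusion step described in this entry — per-node weights from a gating network combining features of several SSL tasks — reduces to a softmax-weighted mixture. The dimensions and the single-linear-layer gate below are illustrative assumptions, not DyFSS's exact architecture.

```python
import numpy as np

# Sketch: a gating network produces per-node softmax weights that
# mix feature matrices coming from several self-supervised tasks.

rng = np.random.default_rng(2)
n_nodes, n_tasks, dim = 5, 3, 4

# features produced by each SSL task: (tasks, nodes, dim)
task_feats = rng.normal(size=(n_tasks, n_nodes, dim))
X = rng.normal(size=(n_nodes, dim))        # raw node features feed the gate
W_gate = rng.normal(size=(dim, n_tasks))   # gating network (one linear layer)

logits = X @ W_gate                        # (nodes, tasks)
weights = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# each node mixes the task-specific features with its own weights
fused = np.einsum('nt,tnd->nd', weights, task_feats)

print(fused.shape)  # one fused representation per node
```

Because the weights are computed per node, each node can emphasize a different pretext task, which is the "every node is different" point of the title.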
arXiv Detail & Related papers (2024-01-12T14:24:10Z)
- Robust Graph Structure Learning with the Alignment of Features and Adjacency Matrix [8.711977569042865]
Many approaches have been proposed for graph structure learning (GSL) to jointly learn a clean graph structure and corresponding representations.
This paper proposes a novel regularized GSL approach, particularly with an alignment of feature information and graph information.
We conduct experiments on real-world graphs to evaluate the effectiveness of our approach.
arXiv Detail & Related papers (2023-07-05T09:05:14Z)
- OpenGSL: A Comprehensive Benchmark for Graph Structure Learning [40.50100033304329]
We introduce OpenGSL, the first comprehensive benchmark for Graph Structure Learning (GSL).
OpenGSL enables a fair comparison among state-of-the-art GSL methods by evaluating them across various popular datasets.
We find that there is no significant correlation between the homophily of the learned structure and task performance, challenging the common belief.
arXiv Detail & Related papers (2023-06-17T07:22:25Z)
- On the Transferability of Visual Features in Generalized Zero-Shot Learning [28.120004119724577]
Generalized Zero-Shot Learning (GZSL) aims to train a classifier that can generalize to unseen classes.
In this work, we investigate the utility of different GZSL methods when using different feature extractors.
We also examine how these models' pre-training objectives, datasets, and architecture design affect their feature representation ability.
arXiv Detail & Related papers (2022-11-22T18:59:09Z)
- Compact Graph Structure Learning via Mutual Information Compression [79.225671302689]
Graph Structure Learning (GSL) has attracted considerable attention for its capacity to optimize the graph structure and learn the parameters of Graph Neural Networks (GNNs) simultaneously.
We propose a Compact GSL architecture by MI compression, named CoGSL.
We conduct extensive experiments on several datasets under clean and attacked conditions, which demonstrate the effectiveness and robustness of CoGSL.
arXiv Detail & Related papers (2022-01-14T16:22:33Z)
- Graph Structure Learning with Variational Information Bottleneck [70.62851953251253]
We propose a novel Variational Information Bottleneck guided Graph Structure Learning framework, namely VIB-GSL.
VIB-GSL learns an informative and compressive graph structure to distill the actionable information for specific downstream tasks.
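The "informative and compressive" trade-off in this entry follows the information-bottleneck pattern: a task loss plus a penalty that pushes the learned structure toward an uninformative prior. The Bernoulli-KL surrogate and the beta value below are assumptions for illustration, not VIB-GSL's exact objective.

```python
import numpy as np

# Schematic bottleneck-style objective: task loss + beta * compression,
# where compression penalizes confident (informative) soft edges
# relative to a uniform Bernoulli prior.

def bernoulli_kl(p, prior=0.5, eps=1e-9):
    p = np.clip(p, eps, 1 - eps)
    return p * np.log(p / prior) + (1 - p) * np.log((1 - p) / (1 - prior))

edge_probs = np.array([0.95, 0.9, 0.1, 0.05])  # learned soft edges
task_loss = 0.30                               # e.g. cross-entropy on labels
beta = 0.1                                     # compression strength

compression = bernoulli_kl(edge_probs).sum()   # surrogate for I(structure; input)
vib_loss = task_loss + beta * compression
print(vib_loss)
```

Minimizing this trades predictive structure (low task loss) against structure that carries no more information than needed, which is what "distill the actionable information" refers to.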
arXiv Detail & Related papers (2021-12-16T14:22:13Z)
- Deep Graph Structure Learning for Robust Representations: A Survey [20.564611153151834]
Graph Neural Networks (GNNs) are widely used for analyzing graph-structured data.
To improve the robustness of GNN models, many studies have been proposed around the central concept of Graph Structure Learning.
arXiv Detail & Related papers (2021-03-04T13:49:25Z)
- Graph-based Semi-supervised Learning: A Comprehensive Review [51.26862262550445]
Semi-supervised learning (SSL) has tremendous value in practice due to its ability to utilize both labeled and unlabeled data.
An important class of SSL methods is to naturally represent data as graphs, which corresponds to graph-based semi-supervised learning (GSSL) methods.
GSSL methods have demonstrated their advantages in various domains due to the uniqueness of their structure, the universality of their applications, and their scalability to large-scale data.
arXiv Detail & Related papers (2021-02-26T05:11:09Z)
- Self-Supervised Learning of Graph Neural Networks: A Unified Review [50.71341657322391]
Self-supervised learning is emerging as a new paradigm for making use of large amounts of unlabeled samples.
We provide a unified review of different ways of training graph neural networks (GNNs) using SSL.
Our treatment of SSL methods for GNNs sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms.
arXiv Detail & Related papers (2021-02-22T03:43:45Z)
- Self-supervised Learning on Graphs: Deep Insights and New Direction [66.78374374440467]
Self-supervised learning (SSL) aims to create domain-specific pretext tasks on unlabeled data.
There is increasing interest in generalizing deep learning to the graph domain in the form of graph neural networks (GNNs).
arXiv Detail & Related papers (2020-06-17T20:30:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.