UGSL: A Unified Framework for Benchmarking Graph Structure Learning
- URL: http://arxiv.org/abs/2308.10737v1
- Date: Mon, 21 Aug 2023 14:05:21 GMT
- Title: UGSL: A Unified Framework for Benchmarking Graph Structure Learning
- Authors: Bahare Fatemi, Sami Abu-El-Haija, Anton Tsitsulin, Mehran Kazemi,
Dustin Zelle, Neslihan Bulut, Jonathan Halcrow, Bryan Perozzi
- Abstract summary: We propose a benchmarking strategy for graph structure learning using a unified framework.
Our framework, called Unified Graph Structure Learning (UGSL), reformulates existing models into a single model.
Our results provide a clear and concise understanding of the different methods in this area as well as their strengths and weaknesses.
- Score: 19.936173198345053
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) demonstrate outstanding performance in a broad
range of applications. While the majority of GNN applications assume that a
graph structure is given, some recent methods substantially expanded the
applicability of GNNs by showing that they may be effective even when no graph
structure is explicitly provided. In such settings, the GNN parameters and the
graph structure are learned jointly. However, previous studies adopt different
experimental setups, making it difficult to compare their merits. In this paper, we propose a
benchmarking strategy for graph structure learning using a unified framework.
Our framework, called Unified Graph Structure Learning (UGSL), reformulates
existing models into a single model. We implement a wide range of existing
models in our framework and conduct extensive analyses of the effectiveness of
different components in the framework. Our results provide a clear and concise
understanding of the different methods in this area as well as their strengths
and weaknesses. The benchmark code is available at
https://github.com/google-research/google-research/tree/master/ugsl.
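As an illustration of the core idea in the abstract (jointly learning a graph structure together with the GNN parameters when no input graph is given), the following is a minimal sketch, assuming a simple similarity-based edge scorer, top-k sparsification, and a two-layer GCN trained on a node classification loss. The class and function names (EdgeScorer, sparsify_topk, JointGSLModel) and all hyperparameters are illustrative and are not taken from the UGSL codebase.

```python
# Minimal sketch (not the UGSL implementation): jointly learning a graph
# structure and GNN parameters from node features alone.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeScorer(nn.Module):
    """Scores candidate edges from projected node features (illustrative)."""

    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, emb_dim)

    def forward(self, x):
        z = F.normalize(self.proj(x), dim=-1)
        return z @ z.t()  # dense cosine-similarity edge scores


def sparsify_topk(scores, k):
    """Keeps the k highest-scoring neighbors per node (illustrative sparsifier)."""
    top = torch.topk(scores, k, dim=-1)
    adj = torch.zeros_like(scores).scatter_(-1, top.indices, top.values)
    return F.relu(0.5 * (adj + adj.t()))  # symmetrize, drop negative weights


class GCNLayer(nn.Module):
    """One graph convolution over the learned, weighted adjacency."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        adj = adj + torch.eye(adj.size(0), device=adj.device)  # add self-loops
        adj = adj / adj.sum(-1, keepdim=True).clamp(min=1e-6)  # row-normalize
        return self.lin(adj @ x)


class JointGSLModel(nn.Module):
    """Edge scorer + sparsifier + GNN, trained end-to-end on the task loss."""

    def __init__(self, in_dim, hid_dim, num_classes, k=10):
        super().__init__()
        self.scorer = EdgeScorer(in_dim, hid_dim)
        self.gnn1 = GCNLayer(in_dim, hid_dim)
        self.gnn2 = GCNLayer(hid_dim, num_classes)
        self.k = k

    def forward(self, x):
        adj = sparsify_topk(self.scorer(x), self.k)  # learned graph structure
        return self.gnn2(F.relu(self.gnn1(x, adj)), adj)


# Usage: node features only, no input graph; the node-classification loss
# updates both the edge scorer and the GNN weights in one backward pass.
x = torch.randn(100, 32)          # 100 nodes, 32 features each
y = torch.randint(0, 7, (100,))   # 7 node classes
model = JointGSLModel(in_dim=32, hid_dim=64, num_classes=7, k=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
opt.zero_grad()
F.cross_entropy(model(x), y).backward()
opt.step()
```

Because the retained edge weights stay differentiable after top-k selection, the task loss reaches both the edge scorer and the GNN weights in the same backward pass, which is the joint learning setup the abstract describes.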
Related papers
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are the de facto solution for learning on graph-structured data.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes SE-GSL, a general GSL framework built on structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z)
- Characterizing the Efficiency of Graph Neural Network Frameworks with a Magnifying Glass [10.839902229218577]
Graph neural networks (GNNs) have received great attention due to their success in various graph-related learning tasks.
Recent GNNs have been developed with different graph sampling techniques for mini-batch training of GNNs on large graphs.
It is unknown how 'eco-friendly' these frameworks are from a green computing perspective.
arXiv Detail & Related papers (2022-11-06T04:22:19Z)
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training manner, named EnGCN, to address the existing issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- GPN: A Joint Structural Learning Framework for Graph Neural Networks [36.38529113603987]
We propose a GNN-based joint learning framework that simultaneously learns the graph structure and the downstream task.
Our method is the first GNN-based bilevel optimization framework for resolving this task.
arXiv Detail & Related papers (2022-05-12T09:06:04Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph (a minimal sketch of this contrastive objective appears after this list).
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Bridging the Gap between Spatial and Spectral Domains: A Unified Framework for Graph Neural Networks [61.17075071853949]
Graph neural networks (GNNs) are designed to deal with graph-structured data that classical deep learning does not easily handle.
The purpose of this study is to establish a unified framework that integrates GNNs based on spectral graph and approximation theory.
arXiv Detail & Related papers (2021-07-21T17:34:33Z)
- Theoretically Improving Graph Neural Networks via Anonymous Walk Graph Kernels [25.736529232578178]
Graph neural networks (GNNs) have achieved tremendous success in graph mining.
Message-passing GNNs (MPGNNs), the prevailing type of GNN, have been theoretically shown to be unable to distinguish, detect, or count many graph substructures.
We propose GSKN, a GNN model with a theoretically stronger ability to distinguish graph structures.
arXiv Detail & Related papers (2021-04-07T08:50:34Z)
- Deep Graph Structure Learning for Robust Representations: A Survey [20.564611153151834]
Graph Neural Networks (GNNs) are widely used for analyzing graph-structured data.
To improve the robustness of GNN models, many approaches have been proposed around the central concept of Graph Structure Learning.
arXiv Detail & Related papers (2021-03-04T13:49:25Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
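The following is a minimal sketch of the anchor-graph contrastive objective referenced in the "Towards Unsupervised Deep Graph Structure Learning" entry above, assuming node embeddings produced by a shared encoder run on the anchor graph and on the learned graph are compared with an InfoNCE-style loss. The function graph_agreement_loss and its temperature parameter are illustrative and not taken from that paper's code.

```python
# Minimal sketch (not that paper's code): contrastive agreement between node
# embeddings computed on an "anchor graph" and on the learned graph.
import torch
import torch.nn.functional as F


def graph_agreement_loss(z_anchor, z_learned, temperature=0.5):
    """InfoNCE-style loss: node i under the learned graph should match node i
    under the anchor graph and differ from all other nodes."""
    z_a = F.normalize(z_anchor, dim=-1)
    z_l = F.normalize(z_learned, dim=-1)
    logits = z_l @ z_a.t() / temperature         # [N, N] cross-view similarities
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)


# Usage: in practice z_anchor and z_learned would come from a shared GNN
# encoder run on the anchor graph and the learned graph, respectively.
z_anchor = torch.randn(100, 64)
z_learned = torch.randn(100, 64, requires_grad=True)
loss = graph_agreement_loss(z_anchor, z_learned)
loss.backward()  # gradients flow back to the learned-graph branch
```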