GraphGLOW: Universal and Generalizable Structure Learning for Graph
Neural Networks
- URL: http://arxiv.org/abs/2306.11264v1
- Date: Tue, 20 Jun 2023 03:33:22 GMT
- Title: GraphGLOW: Universal and Generalizable Structure Learning for Graph
Neural Networks
- Authors: Wentao Zhao, Qitian Wu, Chenxiao Yang and Junchi Yan
- Abstract summary: This paper introduces the mathematical definition of a novel problem setting: learning graph structures that generalize across datasets.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
- Score: 72.01829954658889
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph structure learning is a well-established problem that aims at
optimizing graph structures adaptive to specific graph datasets to help message
passing neural networks (i.e., GNNs) to yield effective and robust node
embeddings. However, the common limitation of existing models lies in the
underlying \textit{closed-world assumption}: the testing graph is the same as
the training graph. This premise requires independently training the structure
learning model from scratch for each graph dataset, which leads to prohibitive
computation costs and a risk of serious over-fitting. To mitigate
these issues, this paper explores a new direction: learning a
universal structure learning model that can generalize across graph datasets in
an open world. We first introduce the mathematical definition of this novel
problem setting, and describe the model formulation from a probabilistic
data-generative aspect. Then we devise a general framework that coordinates a
single graph-shared structure learner and multiple graph-specific GNNs to
capture the generalizable patterns of optimal message-passing topology across
datasets. The well-trained structure learner can directly produce adaptive
structures for unseen target graphs without any fine-tuning. Across diverse
datasets and various challenging cross-graph generalization protocols, our
experiments show that even without training on target graphs, the proposed
model i) significantly outperforms expressive GNNs trained on input
(non-optimized) topology, and ii) surprisingly performs on par with
state-of-the-art models that independently optimize adaptive structures for
specific target graphs, with notably orders-of-magnitude acceleration for
training on the target graph.
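The abstract describes a structure learner that infers message-passing topology from node features and feeds it to a GNN. A minimal sketch of that idea (illustrative only, not the GraphGLOW architecture: the similarity measure, top-k sparsification, and mean-aggregation layer are all simplifying assumptions):

```python
import numpy as np

def learn_structure(x, k=3):
    """Sketch of a graph-shared structure learner: infer a sparse
    adjacency from node features via top-k cosine similarity.
    (Illustrative simplification; not the paper's actual model.)"""
    normed = x / np.linalg.norm(x, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-loops
    n = x.shape[0]
    adj = np.zeros((n, n))
    idx = np.argsort(sim, axis=1)[:, -k:]   # top-k neighbors per node
    rows = np.repeat(np.arange(n), k)
    adj[rows, idx.ravel()] = 1.0
    return np.maximum(adj, adj.T)           # symmetrize

def message_pass(x, adj):
    """One mean-aggregation message-passing step on the learned graph."""
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    return (adj @ x) / deg

# Toy usage: infer a structure for 6 nodes, then propagate features over it.
x = np.random.default_rng(0).normal(size=(6, 4))
adj = learn_structure(x, k=2)
h = message_pass(x, adj)
```

Because the learner operates on node features rather than a fixed graph, the same function can, in principle, produce adjacencies for any dataset — the property the paper's cross-graph generalization relies on.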
Related papers
- AnyGraph: Graph Foundation Model in the Wild [16.313146933922752]
Graph foundation models offer the potential to learn robust, generalizable representations from graph data.
In this work, we investigate a unified graph model, AnyGraph, designed to handle key challenges.
Our experiments on 38 diverse graph datasets have demonstrated the strong zero-shot learning performance of AnyGraph.
arXiv Detail & Related papers (2024-08-20T09:57:13Z)
- UniGraph: Learning a Unified Cross-Domain Foundation Model for Text-Attributed Graphs [30.635472655668078]
We study foundation models for Text-Attributed Graphs (TAGs) that can generalize to unseen graphs and tasks across diverse domains.
We propose a novel cascaded architecture of Language Models (LMs) and Graph Neural Networks (GNNs) as backbone networks.
We demonstrate the model's effectiveness in self-supervised representation learning on unseen graphs, few-shot in-context transfer, and zero-shot transfer.
arXiv Detail & Related papers (2024-02-21T09:06:31Z)
- Learning Adaptive Neighborhoods for Graph Neural Networks [45.94778766867247]
Graph convolutional networks (GCNs) enable end-to-end learning on graph structured data.
We propose a novel end-to-end differentiable graph generator which builds graph topologies.
Our module can be readily integrated into existing pipelines involving graph convolution operations.
arXiv Detail & Related papers (2023-07-18T08:37:25Z)
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are sensitive to the quality of the underlying graph structure.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
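The anchor-graph idea above — maximize agreement between node embeddings from the anchor graph and the learned graph — can be sketched with a simple contrastive objective. This is a hypothetical simplification (the temperature `tau` and the InfoNCE-style form are assumptions, not that paper's exact loss):

```python
import numpy as np

def contrastive_agreement(z_anchor, z_learned, tau=0.5):
    """Contrastive loss pulling each node's anchor-graph embedding
    toward its learned-graph embedding (positives on the diagonal),
    relative to all other nodes (negatives). Illustrative sketch."""
    za = z_anchor / np.linalg.norm(z_anchor, axis=1, keepdims=True)
    zl = z_learned / np.linalg.norm(z_learned, axis=1, keepdims=True)
    logits = za @ zl.T / tau                         # pairwise cosine sims
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Minimizing this loss drives the two views of each node together, which is what lets the learned topology be optimized without any external supervision.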
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Customized Graph Neural Networks [38.30640892828196]
Graph Neural Networks (GNNs) have greatly advanced the task of graph classification.
We propose a novel customized graph neural network framework, i.e., Customized-GNN.
The proposed framework is general and can be applied to numerous existing graph neural network models.
arXiv Detail & Related papers (2020-05-22T05:22:24Z)
- Adaptive Graph Auto-Encoder for General Data Clustering [90.8576971748142]
Graph-based clustering plays an important role in the clustering area.
Recent studies on graph convolutional neural networks have achieved impressive success on graph-type data.
We propose a graph auto-encoder for general data clustering, which constructs the graph adaptively according to the generative perspective of graphs.
arXiv Detail & Related papers (2020-02-20T10:11:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences.