Towards Unsupervised Deep Graph Structure Learning
- URL: http://arxiv.org/abs/2201.06367v1
- Date: Mon, 17 Jan 2022 11:57:29 GMT
- Title: Towards Unsupervised Deep Graph Structure Learning
- Authors: Yixin Liu, Yu Zheng, Daokun Zhang, Hongxu Chen, Hao Peng, Shirui Pan
- Abstract summary: We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
- Score: 67.58720734177325
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, graph neural networks (GNNs) have emerged as a successful
tool in a variety of graph-related applications. However, the performance of
GNNs can deteriorate when noisy connections occur in the original graph
structure; moreover, the dependence on explicit structures prevents GNNs from
being applied to general unstructured scenarios. To address these issues,
recently emerged deep graph structure learning (GSL) methods propose to jointly
optimize the graph structure along with the GNN under the supervision of a node
classification task. Nonetheless, these methods focus on a supervised learning
scenario, which leads to several problems, i.e., the reliance on labels, the
bias of edge distribution, and the limitation on application tasks. In this
paper, we propose a more practical GSL paradigm, unsupervised graph structure
learning, where the learned graph topology is optimized by data itself without
any external guidance (i.e., labels). To solve the unsupervised GSL problem, we
propose a novel StrUcture Bootstrapping contrastive LearnIng fraMEwork
(SUBLIME for short) with the aid of self-supervised contrastive learning.
Specifically, we generate a learning target from the original data as an
"anchor graph", and use a contrastive loss to maximize the agreement between
the anchor graph and the learned graph. To provide persistent guidance, we
design a novel bootstrapping mechanism that upgrades the anchor graph with
learned structures during model learning. We also design a series of graph
learners and post-processing schemes to model the structures to be learned.
Extensive experiments on eight benchmark datasets demonstrate the
effectiveness of our proposed SUBLIME and the high quality of the optimized
graphs.
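To make the recipe above concrete, here is a minimal PyTorch sketch of the three ingredients the abstract names: a graph learner whose output is post-processed into an adjacency, a node-level contrastive loss between the anchor view and the learned view, and a bootstrapping update of the anchor graph. This is an illustrative sketch, not the authors' implementation; the module names and hyperparameters (k, tau, decay) are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPGraphLearner(nn.Module):
    """One possible graph learner: map node features to embeddings whose
    pairwise cosine similarities define a dense candidate structure."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))

    def forward(self, x):
        h = F.normalize(self.mlp(x), dim=-1)
        return torch.relu(h @ h.t())  # non-negative pairwise similarities

def knn_postprocess(sim, k=20):
    """Post-processing: keep each node's top-k neighbours, then
    symmetrize and row-normalize the sparsified adjacency."""
    topk = torch.topk(sim, k, dim=-1)
    mask = torch.zeros_like(sim).scatter_(-1, topk.indices, 1.0)
    adj = sim * mask
    adj = (adj + adj.t()) / 2  # symmetrization
    return adj / adj.sum(-1, keepdim=True).clamp(min=1e-8)

def nt_xent(z_learned, z_anchor, tau=0.5):
    """Contrastive loss: node i embedded with the learned graph should
    agree with node i embedded with the anchor graph."""
    z1 = F.normalize(z_learned, dim=-1)
    z2 = F.normalize(z_anchor, dim=-1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

@torch.no_grad()
def bootstrap_anchor(anchor_adj, learned_adj, decay=0.99):
    """Bootstrapping: slowly fold the learned structure back into the
    anchor graph so the learning target keeps improving."""
    return decay * anchor_adj + (1.0 - decay) * learned_adj
```

In a full training loop, a shared GNN encoder would embed the node features once with the anchor adjacency and once with the post-processed learned adjacency to produce the two inputs to nt_xent, and bootstrap_anchor would be applied periodically to refresh the target.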
Related papers
- Variational Graph Auto-Encoder Based Inductive Learning Method for Semi-Supervised Classification [10.497590357666114]
We propose the Self-Label Augmented VGAE model for inductive graph representation learning.
To leverage the label information for training, our model takes node labels as one-hot encoded inputs and then performs label reconstruction in model training.
Our proposed model achieves promising results on node classification, with particular superiority under semi-supervised learning settings.
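As a rough sketch of the label-augmented idea (an assumption-laden reading of the summary, not the paper's actual VGAE architecture), the one-hot labels can be appended to the node features, zeroed outside the training set, and reconstructed as an auxiliary objective:

```python
import torch
import torch.nn.functional as F

def label_reconstruction_loss(x, y, train_mask, n_classes, encoder, label_head):
    """Feed one-hot labels alongside node features (zeros where labels are
    hidden) and reconstruct the labels from the learned representation.
    `encoder` and `label_head` are hypothetical stand-in modules."""
    y_onehot = F.one_hot(y, n_classes).float()
    y_onehot[~train_mask] = 0.0                    # hide non-training labels
    h = encoder(torch.cat([x, y_onehot], dim=-1))  # label-augmented input
    logits = label_head(h)
    return F.cross_entropy(logits[train_mask], y[train_mask])
```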
arXiv Detail & Related papers (2024-03-26T08:59:37Z)
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- A Topology-aware Graph Coarsening Framework for Continual Graph Learning [8.136809136959302]
Continual learning on graphs tackles the problem of training a graph neural network (GNN) where graph data arrive in a streaming fashion.
Traditional continual learning strategies such as Experience Replay can be adapted to streaming graphs.
We propose TA$\mathbb{CO}$, a (t)opology-(a)ware graph (co)arsening and (co)ntinual learning framework.
arXiv Detail & Related papers (2024-01-05T22:22:13Z)
- SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are de facto solutions to structural data learning.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z)
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are sensitive to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- GraphMAE: Self-Supervised Masked Graph Autoencoders [52.06140191214428]
We present a masked graph autoencoder GraphMAE that mitigates issues for generative self-supervised graph learning.
We conduct extensive experiments on 21 public datasets for three different graph learning tasks.
The results show that GraphMAE, a simple graph autoencoder with careful designs, consistently outperforms both contrastive and generative state-of-the-art baselines.
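The masked-autoencoding recipe can be sketched in a few lines; the real model uses GNN encoder and decoder modules, a re-mask step, and other careful designs, so treat `encoder`, `decoder`, and `mask_token` below as illustrative stand-ins.

```python
import torch
import torch.nn.functional as F

def masked_feature_loss(x, encoder, decoder, mask_token,
                        mask_rate=0.5, gamma=2.0):
    """Mask a random subset of node feature rows, reconstruct them from
    the encoded representations, and score only the masked rows with a
    scaled cosine error."""
    n = x.size(0)
    idx = torch.randperm(n)[: int(mask_rate * n)]  # nodes to mask
    x_masked = x.clone()
    x_masked[idx] = mask_token                     # learnable [MASK] token
    h = encoder(x_masked)                          # e.g. a GNN over the graph
    x_rec = decoder(h)
    cos = F.cosine_similarity(x_rec[idx], x[idx], dim=-1)
    return ((1.0 - cos) ** gamma).mean()           # scaled cosine error
```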
arXiv Detail & Related papers (2022-05-22T11:57:08Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
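The unrolling idea can be sketched as ISTA-style iterations with learnable step sizes and thresholds. The fidelity gradient below is a deliberately simplified placeholder; the actual GDN derives its updates from the convolutional-mixture model relating observed and latent graphs.

```python
import torch
import torch.nn as nn

class UnrolledDeconv(nn.Module):
    """Truncated, unrolled proximal-gradient iterations that map an
    observed adjacency toward a latent one (a sketch in the spirit of
    the GDN, not its exact parameterization)."""
    def __init__(self, n_layers=5):
        super().__init__()
        self.alpha = nn.Parameter(torch.full((n_layers,), 0.1))  # step sizes
        self.tau = nn.Parameter(torch.full((n_layers,), 0.01))   # thresholds

    def forward(self, a_obs):
        a = a_obs.clone()
        for alpha, tau in zip(self.alpha, self.tau):
            grad = a - a_obs         # placeholder gradient of a fidelity term
            a = a - alpha * grad     # gradient step
            a = torch.relu(a - tau)  # prox step: soft-threshold, keep a >= 0
        return (a + a.t()) / 2       # enforce symmetry
```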
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Deep Graph Structure Learning for Robust Representations: A Survey [20.564611153151834]
Graph Neural Networks (GNNs) are widely used for analyzing graph-structured data.
To improve the robustness of GNN models, many methods have been proposed around the central concept of Graph Structure Learning.
arXiv Detail & Related papers (2021-03-04T13:49:25Z)
- SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks [14.319159694115655]
We propose Simultaneous Learning of Adjacency and GNN Parameters with Self-supervision, or SLAPS, a method that provides additional supervision for inferring a graph structure through a self-supervised task.
A comprehensive experimental study demonstrates that SLAPS scales to large graphs with hundreds of thousands of nodes and outperforms several models that have been proposed to learn a task-specific graph structure on established benchmarks.
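The core idea, a single learned adjacency trained by a supervised classifier and a self-supervised feature-denoising task at the same time, can be sketched as a joint loss; the module names and the corruption scheme below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def slaps_style_loss(x, y, train_mask, generator, gnn_cls, gnn_dae,
                     noise_rate=0.2, lam=1.0):
    """Share one learned adjacency between a classification branch and a
    denoising branch, so the structure receives supervision even where
    labels are scarce. All three modules are hypothetical stand-ins."""
    adj = generator(x)                             # learned adjacency
    # Supervised branch: classify labeled nodes through the learned graph.
    logits = gnn_cls(x, adj)
    cls_loss = F.cross_entropy(logits[train_mask], y[train_mask])
    # Self-supervised branch: denoise corrupted features through the graph.
    mask = (torch.rand_like(x) < noise_rate).float()
    x_rec = gnn_dae(x * (1.0 - mask), adj)
    dae_loss = F.mse_loss(x_rec * mask, x * mask)  # score masked entries only
    return cls_loss + lam * dae_loss
```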
arXiv Detail & Related papers (2021-02-09T18:56:01Z)
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning the graph structure and graph embeddings.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
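The iterative scheme can be sketched as alternating between building a graph from the current embeddings and re-embedding the nodes with that graph; `encoder`, `similarity`, and the epsilon-sparsification below are stand-ins rather than the paper's exact design.

```python
import torch

def idgl_style_refinement(x, encoder, similarity, n_iters=3, eps=0.3):
    """Alternate between (a) deriving a graph from the current node
    embeddings and (b) refreshing the embeddings with that graph, for a
    fixed iteration budget (IDGL also checks a convergence criterion)."""
    z, adj = x, None
    for _ in range(n_iters):
        sim = similarity(z)                  # metric-learned similarity
        adj = torch.where(sim > eps, sim, torch.zeros_like(sim))  # sparsify
        adj = adj / adj.sum(-1, keepdim=True).clamp(min=1e-8)
        z = encoder(x, adj)                  # re-embed with refined graph
    return z, adj
```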
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.