SE-GSL: A General and Effective Graph Structure Learning Framework
through Structural Entropy Optimization
- URL: http://arxiv.org/abs/2303.09778v1
- Date: Fri, 17 Mar 2023 05:20:24 GMT
- Title: SE-GSL: A General and Effective Graph Structure Learning Framework
through Structural Entropy Optimization
- Authors: Dongcheng Zou, Hao Peng, Xiang Huang, Renyu Yang, Jianxin Li, Jia Wu,
Chunyang Liu and Philip S. Yu
- Abstract summary: Graph Neural Networks (GNNs) are de facto solutions to structural data learning.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
- Score: 67.28453445927825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) are de facto solutions to structural data
learning. However, they are susceptible to low-quality and unreliable structures,
which are the norm rather than the exception in real-world graphs. Existing
graph structure learning (GSL) frameworks still lack robustness and
interpretability. This paper proposes a general GSL framework, SE-GSL, through
structural entropy and the graph hierarchy abstracted in the encoding tree.
Particularly, we exploit the one-dimensional structural entropy to maximize
embedded information content when auxiliary neighbourhood attributes are fused
to enhance the original graph. A new scheme of constructing optimal encoding
trees is proposed to minimize the uncertainty and noises in the graph whilst
assuring proper community partition in hierarchical abstraction. We present a
novel sample-based mechanism for restoring the graph structure via node
structural entropy distribution. It increases the connectivity among nodes with
larger uncertainty in lower-level communities. SE-GSL is compatible with
various GNN models and enhances the robustness towards noisy and heterophily
structures. Extensive experiments show significant improvements in the
effectiveness and robustness of structure learning and node representation
learning.
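The one-dimensional structural entropy the abstract exploits is conventionally defined as H¹(G) = -Σ_v (d_v/2m) log₂(d_v/2m), where d_v is the degree of node v and m is the edge count. As a rough illustration (the function name and edge-list representation are our own, not from the paper):

```python
import math

def one_dim_structural_entropy(edges):
    """One-dimensional structural entropy of an undirected graph:
    H1(G) = -sum_v (d_v / 2m) * log2(d_v / 2m)."""
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    two_m = 2 * len(edges)  # sum of degrees
    return -sum((d / two_m) * math.log2(d / two_m) for d in degree.values())

# Triangle graph: all degrees equal, so entropy is maximal, log2(3)
print(one_dim_structural_entropy([(0, 1), (1, 2), (0, 2)]))
```

Intuitively, a more regular degree distribution yields higher entropy, which is why maximizing H¹ during graph enhancement pushes embedded information content up.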
Related papers
- Synergistic Deep Graph Clustering Network [14.569867830074292]
We propose a graph clustering framework named Synergistic Deep Graph Clustering Network (SynC).
In our approach, we design a Transform Input Graph Auto-Encoder (TIGAE) to obtain high-quality embeddings for guiding structure augmentation.
Notably, representation learning and structure augmentation share weights, significantly reducing the number of model parameters.
arXiv Detail & Related papers (2024-06-22T09:40:34Z)
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
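NodeFormer's kernelized operator builds on the standard Gumbel-Softmax reparameterization, which draws differentiable, approximately one-hot samples from a categorical distribution. A minimal NumPy sketch of the plain (non-kernelized) trick, with names of our own choosing:

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable categorical sample: add Gumbel(0,1) noise to the
    logits, then apply a temperature-scaled softmax. Lower tau -> closer
    to a hard one-hot sample."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-9, 1.0, size=logits.shape)
    g = -np.log(-np.log(u))       # Gumbel(0, 1) noise
    y = (logits + g) / tau
    e = np.exp(y - y.max())       # numerically stable softmax
    return e / e.sum()

# Soft selection over three candidate edges
probs = gumbel_softmax(np.array([2.0, 1.0, 0.1]), tau=0.5)
print(probs)
```

The output is a probability vector that stays differentiable with respect to the logits, which is what lets edge sampling participate in end-to-end training.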
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Self-organization Preserved Graph Structure Learning with Principle of Relevant Information [72.83485174169027]
PRI-GSL is a Graph Structure Learning framework for identifying the self-organization and revealing the hidden structure.
PRI-GSL learns a structure that contains the most relevant yet least redundant information quantified by von Neumann entropy and Quantum Jensen-Shannon divergence.
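The von Neumann entropy used here is commonly computed as S = -Tr(ρ log₂ ρ), where ρ is the trace-normalised graph Laplacian. A small illustrative sketch (the function name and dense-matrix representation are assumptions of ours, not from the paper):

```python
import numpy as np

def von_neumann_entropy(adj):
    """Von Neumann graph entropy S = -Tr(rho log2 rho),
    with rho = L / Tr(L), the trace-normalised graph Laplacian."""
    adj = np.asarray(adj, dtype=float)
    lap = np.diag(adj.sum(axis=1)) - adj    # combinatorial Laplacian
    rho = lap / np.trace(lap)
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]      # convention: 0 * log 0 = 0
    return float(-(eigvals * np.log2(eigvals)).sum())

# Triangle graph: rho has eigenvalues {0, 1/2, 1/2}, so S = 1
print(von_neumann_entropy([[0, 1, 1], [1, 0, 1], [1, 1, 0]]))
```

Graphs whose Laplacian spectrum is spread more evenly score higher, so the quantity serves as a redundancy-aware measure of structural information.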
arXiv Detail & Related papers (2022-12-30T16:02:02Z)
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Compact Graph Structure Learning via Mutual Information Compression [79.225671302689]
Graph Structure Learning (GSL) has attracted considerable attention for its capacity to optimize graph structure and learn the parameters of Graph Neural Networks (GNNs).
We propose a Compact GSL architecture by MI compression, named CoGSL.
We conduct extensive experiments on several datasets under clean and attacked conditions, which demonstrate the effectiveness and robustness of CoGSL.
arXiv Detail & Related papers (2022-01-14T16:22:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.