Self-organization Preserved Graph Structure Learning with Principle of
Relevant Information
- URL: http://arxiv.org/abs/2301.00015v1
- Date: Fri, 30 Dec 2022 16:02:02 GMT
- Title: Self-organization Preserved Graph Structure Learning with Principle of
Relevant Information
- Authors: Qingyun Sun, Jianxin Li, Beining Yang, Xingcheng Fu, Hao Peng, Philip
S. Yu
- Abstract summary: PRI-GSL is a Graph Structure Learning framework for identifying the self-organization and revealing the hidden structure.
PRI-GSL learns a structure that contains the most relevant yet least redundant information quantified by von Neumann entropy and Quantum Jensen-Shannon divergence.
- Score: 72.83485174169027
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most Graph Neural Networks follow the message-passing paradigm, assuming the
observed structure depicts the ground-truth node relationships. However, this
fundamental assumption cannot always be satisfied, as real-world graphs are
often incomplete, noisy, or redundant. How to reveal the inherent graph
structure in a unified way remains under-explored. We propose PRI-GSL, a Graph
Structure Learning framework guided by the Principle of Relevant Information,
providing a simple and unified framework for identifying the self-organization
and revealing the hidden structure. PRI-GSL learns a structure that contains
the most relevant yet least redundant information quantified by von Neumann
entropy and Quantum Jensen-Shannon divergence. PRI-GSL incorporates the
evolution of a continuous-time quantum walk with graph wavelets to encode node
structural roles, showing how the nodes interplay and self-organize
with the graph structure. Extensive experiments demonstrate the superior
effectiveness and robustness of PRI-GSL.
Related papers
- GaGSL: Global-augmented Graph Structure Learning via Graph Information Bottleneck [5.943641527857957]
We propose a novel method named Global-augmented Graph Structure Learning (GaGSL).
The key idea behind GaGSL is to learn a compact and informative graph structure for node classification tasks.
Comprehensive evaluations across a range of datasets reveal the outstanding performance and robustness of GaGSL compared with the state-of-the-art methods.
arXiv Detail & Related papers (2024-11-07T01:23:48Z)
- Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
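The first of those two steps, structural sparsification, can be sketched as an edge-scoring pass that keeps only the most feature-consistent edges. Everything below (the cosine-similarity score, the `keep_ratio` parameter) is a hypothetical illustration of the idea, not the GSSC implementation:

```python
import numpy as np

def sparsify_neighborhood(adj, feats, keep_ratio=0.5):
    """Structural sparsification sketch: score each undirected edge by the
    cosine similarity of its endpoint features and keep the top fraction."""
    unit = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sim = unit @ unit.T
    rows, cols = np.nonzero(np.triu(adj, k=1))   # each undirected edge once
    scores = sim[rows, cols]
    k = max(1, int(keep_ratio * len(scores)))
    keep = np.argsort(scores)[-k:]               # highest-scoring edges
    sparse = np.zeros_like(adj)
    sparse[rows[keep], cols[keep]] = 1
    return sparse + sparse.T                     # keep the graph undirected

# Toy graph: 4 nodes, 5 edges; features put nodes {0,1} and {2,3} in two groups.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]], float)
feats = np.array([[1, 0], [1, 0.1], [0, 1], [0.1, 1]], float)
pruned = sparsify_neighborhood(adj, feats, keep_ratio=0.5)
```

On this toy input the surviving edges are the two within-group ones (0-1 and 2-3), i.e. the cross-group edges are treated as potentially noisy and dropped.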
arXiv Detail & Related papers (2024-09-09T12:56:02Z)
- SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are the de facto solution for learning on structured data.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z)
- Structure-Preserving Graph Representation Learning [43.43429108503634]
We propose a novel Structure-Preserving Graph Representation Learning (SPGRL) method to fully capture the structure information of graphs.
Specifically, to reduce the uncertainty and misinformation of the original graph, we construct a feature graph as a complementary view via the k-Nearest Neighbor method.
Our method achieves superior performance on the semi-supervised node classification task and excellent robustness under noise perturbations of the graph structure or node features.
arXiv Detail & Related papers (2022-09-02T02:49:19Z)
- Compact Graph Structure Learning via Mutual Information Compression [79.225671302689]
Graph Structure Learning (GSL) has attracted considerable attention for its capacity to jointly optimize the graph structure and the parameters of Graph Neural Networks (GNNs).
We propose a Compact GSL architecture by MI compression, named CoGSL.
We conduct extensive experiments on several datasets under clean and attacked conditions, which demonstrate the effectiveness and robustness of CoGSL.
arXiv Detail & Related papers (2022-01-14T16:22:33Z)
- Graph Structure Learning with Variational Information Bottleneck [70.62851953251253]
We propose a novel Variational Information Bottleneck guided Graph Structure Learning framework, namely VIB-GSL.
VIB-GSL learns an informative and compressive graph structure to distill the actionable information for specific downstream tasks.
arXiv Detail & Related papers (2021-12-16T14:22:13Z)
- Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB) principle, Graph Information Bottleneck (GIB) aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
arXiv Detail & Related papers (2020-10-24T07:13:00Z)
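The objective the IB-based entries above share can be stated compactly. Writing the graph data as $D = (A, X)$ (adjacency and features) and the target as $Y$, the representation $Z$ is chosen to trade predictiveness against compression (this is the generic IB formulation, not any one paper's exact loss):

```latex
\min_{Z}\; -I(Z; Y) + \beta\, I(Z; D), \qquad D = (A, X)
```

Here $\beta > 0$ controls the trade-off: larger values compress more aggressively, discarding structure-and-feature information that does not help predict $Y$.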
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.