Compact Graph Structure Learning via Mutual Information Compression
- URL: http://arxiv.org/abs/2201.05540v1
- Date: Fri, 14 Jan 2022 16:22:33 GMT
- Title: Compact Graph Structure Learning via Mutual Information Compression
- Authors: Nian Liu, Xiao Wang, Lingfei Wu, Yu Chen, Xiaojie Guo, Chuan Shi
- Abstract summary: Graph Structure Learning (GSL) has attracted considerable attention for its capacity to optimize graph structure and learn the parameters of Graph Neural Networks (GNNs).
We propose a Compact GSL architecture by MI compression, named CoGSL.
We conduct extensive experiments on several datasets under clean and attacked conditions, which demonstrate the effectiveness and robustness of CoGSL.
- Score: 79.225671302689
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Structure Learning (GSL) has recently attracted considerable attention
for its capacity to optimize graph structure while simultaneously learning suitable
parameters of Graph Neural Networks (GNNs). Current GSL methods mainly learn an
optimal graph structure (final view) from single or multiple information sources
(basic views); however, theoretical guidance on what constitutes an optimal graph
structure remains unexplored. In essence, an optimal graph structure should contain
only the information relevant to the task while compressing redundant noise as much
as possible; we define such a structure as a "minimal sufficient structure", which
maintains both accuracy and robustness. How can such a structure be obtained in a
principled way? In this paper, we theoretically prove that if we optimize the basic
views and the final view based on mutual information, while simultaneously maintaining
their performance on the labels, the final view will be a minimal sufficient
structure. With this guidance, we propose a Compact GSL architecture via MI
compression, named CoGSL. Specifically, two basic views are extracted from the
original graph as the two inputs of the model, and each is re-estimated by a view
estimator. We then propose an adaptive technique to fuse the estimated views into
the final view. Furthermore, we maintain the performance of the estimated views and
the final view while reducing the mutual information between every pair of views. To
comprehensively evaluate CoGSL, we conduct extensive experiments on several datasets
under both clean and attacked conditions, which demonstrate its effectiveness and
robustness.
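The training objective the abstract describes (keep every view predictive of the labels while compressing the mutual information shared between each pair of views) can be sketched as below. This is a minimal illustration, not the paper's actual method: the function names, the squared-cosine-similarity stand-in for an MI estimate, and the trade-off weight `lam` are all illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # numerically stable row-wise softmax
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(logits, labels):
    # mean negative log-likelihood of the true labels
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def mi_proxy(h1, h2):
    # Illustrative stand-in for a mutual-information estimate between two
    # view representations: mean squared cosine similarity of paired rows.
    # (The paper uses a learned MI estimator; this is only a sketch.)
    a = h1 / np.linalg.norm(h1, axis=1, keepdims=True)
    b = h2 / np.linalg.norm(h2, axis=1, keepdims=True)
    return float(np.mean(np.sum(a * b, axis=1) ** 2))

def cogsl_style_loss(view_logits, view_embeds, labels, lam=0.1):
    # Maintain each view's performance on the labels ...
    cls = sum(cross_entropy(l, labels) for l in view_logits)
    # ... while penalizing (a proxy for) the MI between every pair of views.
    n = len(view_embeds)
    mi = sum(mi_proxy(view_embeds[i], view_embeds[j])
             for i in range(n) for j in range(i + 1, n))
    return cls + lam * mi
```

Minimizing the second term pushes the views toward carrying non-redundant information, which is the intuition behind the "minimal sufficient structure" above.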
Related papers
- GaGSL: Global-augmented Graph Structure Learning via Graph Information Bottleneck [5.943641527857957]
We propose a novel method named Global-augmented Graph Structure Learning (GaGSL).
The key idea behind GaGSL is to learn a compact and informative graph structure for node classification tasks.
Comprehensive evaluations across a range of datasets reveal the outstanding performance and robustness of GaGSL compared with the state-of-the-art methods.
arXiv Detail & Related papers (2024-11-07T01:23:48Z) - Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
arXiv Detail & Related papers (2024-09-09T12:56:02Z) - SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are the de facto solution for learning on structural data.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z) - Self-organization Preserved Graph Structure Learning with Principle of Relevant Information [72.83485174169027]
PRI-GSL is a Graph Structure Learning framework for identifying the self-organization and revealing the hidden structure.
PRI-GSL learns a structure that contains the most relevant yet least redundant information quantified by von Neumann entropy and Quantum Jensen-Shannon divergence.
arXiv Detail & Related papers (2022-12-30T16:02:02Z) - Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z) - Self-Supervised Graph Structure Refinement for Graph Neural Networks [31.924317784535155]
Graph structure learning (GSL) aims to learn the adjacency matrix for graph neural networks (GNNs)
Most existing GSL works apply a joint learning framework where the estimated adjacency matrix and GNN parameters are optimized for downstream tasks.
We propose a graph structure refinement (GSR) framework with a pretrain-finetune pipeline.
arXiv Detail & Related papers (2022-11-12T02:01:46Z) - Structure-Preserving Graph Representation Learning [43.43429108503634]
We propose a novel Structure-Preserving Graph Representation Learning (SPGRL) method to fully capture the structure information of graphs.
Specifically, to reduce the uncertainty and misinformation of the original graph, we construct a feature graph as a complementary view via the k-Nearest Neighbor method.
Our method has quite superior performance on semi-supervised node classification task and excellent robustness under noise perturbation on graph structure or node features.
arXiv Detail & Related papers (2022-09-02T02:49:19Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between the graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-efficient, obtains stable results, and scales well with data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.