Recognizing Predictive Substructures with Subgraph Information
Bottleneck
- URL: http://arxiv.org/abs/2103.11155v1
- Date: Sat, 20 Mar 2021 11:19:43 GMT
- Title: Recognizing Predictive Substructures with Subgraph Information
Bottleneck
- Authors: Junchi Yu, Tingyang Xu, Yu Rong, Yatao Bian, Junzhou Huang, Ran He
- Abstract summary: We propose a novel subgraph information bottleneck (SIB) framework to recognize such subgraphs, named IB-subgraph.
The intractability of mutual information and the discrete nature of graph data make the objective of SIB notoriously hard to optimize.
Experiments on graph learning and large-scale point cloud tasks demonstrate the superior properties of IB-subgraph.
- Score: 97.19131149357234
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The emergence of the Graph Convolutional Network (GCN) has greatly
boosted the progress of graph learning. However, two disturbing factors impede
the further development of GCN: noise and redundancy in graph data, and the
lack of interpretation for prediction results. One solution is to recognize a
predictive yet compressed subgraph, which removes the noise and redundancy and
yields the interpretable part of the graph. This setting is closely related to
the information bottleneck (IB) principle, which has been less studied on
graph-structured data and GCN. Inspired by the IB principle, we propose a novel
subgraph information bottleneck (SIB) framework to recognize such subgraphs,
named IB-subgraph. However, the intractability of mutual information and the
discrete nature of graph data make the SIB objective notoriously hard to
optimize. To this end, we introduce a bilevel optimization scheme coupled with
a mutual information estimator for irregular graphs. Moreover, we propose a
continuous relaxation for subgraph selection, together with a connectivity
loss for stabilization. We further prove a theoretical error bound for our
mutual information estimation scheme and the noise-invariant nature of
IB-subgraph. Extensive experiments on graph learning and large-scale point
cloud tasks demonstrate the superior properties of IB-subgraph.
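For orientation, the trade-off described in the abstract is usually written as an IB-style objective: keep the subgraph G_sub predictive of the label Y while compressing away the rest of the input graph G. The form below is the common one in this line of work; the paper's exact formulation may differ in details.

```latex
% IB-style subgraph objective (sketch); beta trades prediction for compression
\max_{G_{\mathrm{sub}}} \; I\left(Y;\, G_{\mathrm{sub}}\right) \;-\; \beta \, I\left(G;\, G_{\mathrm{sub}}\right)
```

The continuous relaxation of subgraph selection with a connectivity loss mentioned in the abstract can be sketched as follows. This is a hypothetical, minimal PyTorch sketch, not the authors' implementation: the names (SoftSubgraphSelector, connectivity_loss) are invented, node embeddings would normally come from a GCN backbone, and the bilevel mutual-information estimation step is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftSubgraphSelector(nn.Module):
    """Scores each node and outputs a soft 2-way (keep vs. drop) assignment,
    which relaxes the discrete choice of a subgraph."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2),
        )

    def forward(self, node_emb: torch.Tensor) -> torch.Tensor:
        # node_emb: (num_nodes, hidden_dim) -> soft assignment (num_nodes, 2)
        return F.softmax(self.scorer(node_emb), dim=-1)

def connectivity_loss(assign: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """Stabilizes selection by pushing the row-normalized S^T A S toward the
    identity, so selected nodes mostly connect to other selected nodes."""
    sts = assign.t() @ adj @ assign                     # (2, 2)
    sts = sts / (sts.sum(dim=-1, keepdim=True) + 1e-8)  # row-normalize
    return torch.norm(sts - torch.eye(2, device=assign.device), p="fro")

# Usage with random stand-ins for a GCN's node embeddings and adjacency matrix.
num_nodes, hidden_dim = 16, 32
node_emb = torch.randn(num_nodes, hidden_dim)
adj = (torch.rand(num_nodes, num_nodes) > 0.8).float()
adj = ((adj + adj.t()) > 0).float()                     # symmetrize

selector = SoftSubgraphSelector(hidden_dim)
assign = selector(node_emb)                             # soft keep/drop weights
subgraph_emb = assign[:, :1].t() @ node_emb             # pooled "kept" embedding
loss_con = connectivity_loss(assign, adj)
```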
Related papers
- Subgraph Aggregation for Out-of-Distribution Generalization on Graphs [29.884717215947745]
Out-of-distribution (OOD) generalization in Graph Neural Networks (GNNs) has gained significant attention.
We propose a novel framework, SubGraph Aggregation (SuGAr), designed to learn a diverse set of subgraphs.
Experiments on both synthetic and real-world datasets demonstrate that SuGAr outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-10-29T16:54:37Z)
- Towards Self-Interpretable Graph-Level Anomaly Detection [73.1152604947837]
Graph-level anomaly detection (GLAD) aims to identify graphs that exhibit notable dissimilarity compared to the majority in a collection.
We propose a Self-Interpretable Graph aNomaly dETection model (SIGNET) that detects anomalous graphs and simultaneously generates informative explanations.
arXiv Detail & Related papers (2023-10-25T10:10:07Z)
- Towards Explanation for Unsupervised Graph-Level Representation Learning [108.31036962735911]
Existing explanation methods focus on supervised settings, e.g., node classification and graph classification, while the explanation of unsupervised graph-level representation learning remains unexplored.
In this paper, we advance the Information Bottleneck (IB) principle to tackle the proposed explanation problem for unsupervised graph representations, which leads to a novel principle, Unsupervised Subgraph Information Bottleneck (USIB).
We also theoretically analyze the connection between graph representations and explanatory subgraphs on the label space, which reveals that the robustness of representations benefits the fidelity of explanatory subgraphs.
arXiv Detail & Related papers (2022-05-20T02:50:15Z)
- Improving Subgraph Recognition with Variational Graph Information Bottleneck [62.69606854404757]
Subgraph recognition aims to discover a compressed substructure of a graph that is most informative about the graph property.
This paper introduces a noise injection method to compress the information in the subgraphs, which leads to a novel Variational Graph Information Bottleneck (VGIB) framework.
arXiv Detail & Related papers (2021-12-18T10:51:13Z)
- Graph Structure Learning with Variational Information Bottleneck [70.62851953251253]
We propose a novel Variational Information Bottleneck guided Graph Structure Learning framework, namely VIB-GSL.
VIB-GSL learns an informative and compressive graph structure to distill the actionable information for specific downstream tasks.
arXiv Detail & Related papers (2021-12-16T14:22:13Z)
- Graph Information Bottleneck for Subgraph Recognition [103.37499715761784]
We propose a framework of Graph Information Bottleneck (GIB) for the subgraph recognition problem in deep graph learning.
Under this framework, one can recognize the maximally informative yet compressive subgraph, named IB-subgraph.
We evaluate the properties of the IB-subgraph in three application scenarios: improvement of graph classification, graph interpretation and graph denoising.
arXiv Detail & Related papers (2020-10-12T09:32:20Z)