Principle of Relevant Information for Graph Sparsification
- URL: http://arxiv.org/abs/2206.00118v1
- Date: Tue, 31 May 2022 21:00:42 GMT
- Title: Principle of Relevant Information for Graph Sparsification
- Authors: Shujian Yu, Francesco Alesiani, Wenzhe Yin, Robert Jenssen, Jose C. Principe
- Abstract summary: Graph sparsification aims to reduce the number of edges of a graph while maintaining its structural properties.
We propose the first general and effective information-theoretic formulation of graph sparsification, by taking inspiration from the Principle of Relevant Information (PRI).
We present three representative real-world applications, namely graph sparsification, graph regularized multi-task learning, and medical imaging-derived brain network classification.
- Score: 27.54740921723433
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph sparsification aims to reduce the number of edges of a graph while
maintaining its structural properties. In this paper, we propose the first
general and effective information-theoretic formulation of graph
sparsification, by taking inspiration from the Principle of Relevant
Information (PRI). To this end, we extend the PRI from a standard scalar random
variable setting to structured data (i.e., graphs). Our Graph-PRI objective is
achieved by operating on the graph Laplacian, made possible by expressing the
graph Laplacian of a subgraph in terms of a sparse edge selection vector
$\mathbf{w}$. We provide both theoretical and empirical justifications for the
validity of our Graph-PRI approach, and we analyze its analytical solutions in
a few special cases. Finally, we present three representative real-world
applications, namely graph sparsification, graph regularized multi-task
learning, and medical imaging-derived brain network classification, to
demonstrate the effectiveness, the versatility and the enhanced
interpretability of our approach over prevalent sparsification techniques. Code
of Graph-PRI is available at https://github.com/SJYuCNEL/PRI-Graphs
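
As a concrete illustration of the parameterization described in the abstract, below is a minimal NumPy sketch (not the authors' released code linked above) of expressing a subgraph Laplacian through a sparse edge-selection vector $\mathbf{w}$, i.e. $L(\mathbf{w}) = B\,\mathrm{diag}(\mathbf{w})\,B^\top$ with $B$ the signed node-edge incidence matrix. The `von_neumann_entropy` helper is an assumption about the kind of entropy term a PRI-style objective (roughly $\min_{\mathbf{w}} H(L(\mathbf{w})) + \beta\, D(L(\mathbf{w})\,\|\,L_G)$) could use; consult the paper for the exact formulation.

```python
import numpy as np

def incidence_matrix(n_nodes, edges):
    """Signed node-edge incidence matrix B of shape (n_nodes, n_edges)."""
    B = np.zeros((n_nodes, len(edges)))
    for e, (u, v) in enumerate(edges):
        B[u, e] = 1.0
        B[v, e] = -1.0
    return B

def laplacian_from_selection(B, w):
    """Laplacian of the subgraph induced by the edge-selection vector w:
    L(w) = B diag(w) B^T = sum_e w_e b_e b_e^T."""
    return B @ np.diag(w) @ B.T

def von_neumann_entropy(L):
    """Entropy of the trace-normalized Laplacian spectrum. This is one common
    information-theoretic surrogate for graph entropy; it is an assumption
    here, not necessarily the exact entropy used in Graph-PRI."""
    rho = L / np.trace(L)
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]
    return float(-(eig * np.log(eig)).sum())

# Toy graph: a 4-cycle.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
B = incidence_matrix(4, edges)

# Selecting all edges recovers the full graph Laplacian ...
L_full = laplacian_from_selection(B, np.ones(len(edges)))
# ... while a sparse binary w yields the Laplacian of the chosen subgraph.
w = np.array([1.0, 0.0, 1.0, 0.0])
L_sub = laplacian_from_selection(B, w)

print(von_neumann_entropy(L_full), von_neumann_entropy(L_sub))
```

Because $L(\mathbf{w})$ is linear in $\mathbf{w}$, sparsification can be posed as optimizing a smooth objective over the selection vector rather than searching over discrete edge subsets, which is what makes the Laplacian-based formulation convenient.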
Related papers
- Subgraph Aggregation for Out-of-Distribution Generalization on Graphs [29.884717215947745]
Out-of-distribution (OOD) generalization in Graph Neural Networks (GNNs) has gained significant attention.
We propose a novel framework, SubGraph Aggregation (SuGAr), designed to learn a diverse set of subgraphs.
Experiments on both synthetic and real-world datasets demonstrate that SuGAr outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-10-29T16:54:37Z)
- Explanation-Preserving Augmentation for Semi-Supervised Graph Representation Learning [13.494832603509897]
Graph representation learning (GRL) has emerged as an effective technique, achieving performance improvements in a wide range of tasks such as node classification and graph classification.
We propose a novel method, Explanation-Preserving Augmentation (EPA), that leverages graph explanation techniques for generating augmented graphs.
EPA first uses a small number of labels to train a graph explainer to infer the sub-structures (explanations) that are most relevant to a graph's semantics.
arXiv Detail & Related papers (2024-10-16T15:18:03Z)
- Does Graph Prompt Work? A Data Operation Perspective with Theoretical Analysis [7.309233340654514]
This paper introduces a theoretical framework that rigorously analyzes graph prompting from a data operation perspective.
We provide a formal guarantee theorem, demonstrating graph prompts' capacity to approximate graph transformation operators.
We derive upper bounds on the error of these data operations by graph prompts for a single graph and extend this discussion to batches of graphs.
arXiv Detail & Related papers (2024-10-02T15:07:13Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
arXiv Detail & Related papers (2022-06-28T02:10:05Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Unbiased Graph Embedding with Biased Graph Observations [52.82841737832561]
We propose a principled new way for obtaining unbiased representations by learning from an underlying bias-free graph.
Based on this new perspective, we propose two complementary methods for uncovering such an underlying graph.
arXiv Detail & Related papers (2021-10-26T18:44:37Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)