Graph Structure Learning with Bi-level Optimization
- URL: http://arxiv.org/abs/2411.17062v1
- Date: Tue, 26 Nov 2024 03:00:30 GMT
- Title: Graph Structure Learning with Bi-level Optimization
- Authors: Nan Yin,
- Abstract summary: We propose a novel Graph Structure Learning (GSL) method to improve the robustness of graph networks (GNNs) from a global view.
We apply a generic structure extractor to transform GNNs in the form of learning structure and common parameters.
We model the learning process as a novel bi-level optimization, ie textitGeneric Structure Extraction with Bi-level Optimization for Graph Structure Learning (GSEBO)
We instantiate the proposed GSEBO on classical GNNs and compare it with the state-of-the-art GSL methods.
- Score: 2.2435959256503377
- License:
- Abstract: Currently, most Graph Structure Learning (GSL) methods, as a means of learning graph structure, improve the robustness of GNNs merely from a local view by considering the local information related to each edge and indiscriminately applying the mechanism across edges, which may suffer from the local structure heterogeneity of the graph (i.e., the uneven distribution of inter-class connections over nodes). To overcome these drawbacks, we extract the graph structure as a learnable parameter and jointly learn the structure and common parameters of the GNN from the global view. Notably, the common parameters contain the global information for node feature mapping, which is also crucial for structure optimization (i.e., optimizing the structure relies on global mapping information). Mathematically, we apply a generic structure extractor to abstract the graph structure and transform GNNs into the form of learning structure and common parameters. Then, we model the learning process as a novel bi-level optimization, i.e., Generic Structure Extraction with Bi-level Optimization for Graph Structure Learning (GSEBO), which optimizes the GNN parameters in the upper level to obtain the global mapping information, while the graph structure is optimized in the lower level with the global information learned from the upper level. We instantiate the proposed GSEBO on classical GNNs and compare it with state-of-the-art GSL methods. Extensive experiments on four real-world datasets validate the effectiveness of the proposed GSEBO.
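The alternating bi-level scheme described above can be illustrated with a short sketch: the upper level updates the GNN's common parameters while the structure is held fixed, and the lower level updates a learnable adjacency using the mapping just learned. This is a minimal PyTorch sketch under the assumptions of a dense learnable adjacency and simple alternating gradient steps; it is not the authors' GSEBO implementation, and all names (LearnableGraphGCN, bilevel_train, upper_steps, lower_steps) are illustrative.

```python
# Minimal sketch (not the authors' code) of alternating bi-level optimization:
# upper level trains the GNN's common parameters, lower level refines a
# learnable adjacency using the mapping learned in the upper level.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableGraphGCN(nn.Module):
    """Two-layer GCN whose adjacency is itself a trainable parameter (edge logits)."""

    def __init__(self, adj_init: torch.Tensor, in_dim: int, hid_dim: int, n_classes: int):
        super().__init__()
        self.adj_logits = nn.Parameter(adj_init.clone())   # structure (lower level)
        self.w1 = nn.Linear(in_dim, hid_dim)                # common parameters (upper level)
        self.w2 = nn.Linear(hid_dim, n_classes)

    def normalized_adj(self) -> torch.Tensor:
        a = torch.sigmoid(self.adj_logits)                  # keep edge weights in (0, 1)
        a = a + torch.eye(a.size(0))                        # add self-loops
        d = a.sum(dim=1).clamp(min=1e-6).pow(-0.5)
        return d.unsqueeze(1) * a * d.unsqueeze(0)          # D^{-1/2} A D^{-1/2}

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a_hat = self.normalized_adj()
        h = F.relu(a_hat @ self.w1(x))
        return a_hat @ self.w2(h)


def bilevel_train(model, x, y, train_mask, epochs=50, upper_steps=1, lower_steps=1):
    struct_params = [model.adj_logits]
    common_params = [p for n, p in model.named_parameters() if n != "adj_logits"]
    upper_opt = torch.optim.Adam(common_params, lr=1e-2)   # upper level: GNN parameters
    lower_opt = torch.optim.Adam(struct_params, lr=1e-2)   # lower level: graph structure

    for _ in range(epochs):
        for _ in range(upper_steps):                        # fix structure, learn global mapping
            upper_opt.zero_grad()
            loss = F.cross_entropy(model(x)[train_mask], y[train_mask])
            loss.backward()
            upper_opt.step()
        for _ in range(lower_steps):                        # fix mapping, refine structure
            lower_opt.zero_grad()
            loss = F.cross_entropy(model(x)[train_mask], y[train_mask])
            loss.backward()
            lower_opt.step()
    return model


if __name__ == "__main__":
    n, d, c = 8, 5, 3                                       # toy graph: 8 nodes, 5 features, 3 classes
    x, y = torch.randn(n, d), torch.randint(0, c, (n,))
    adj_init = (torch.rand(n, n) > 0.7).float()
    mask = torch.ones(n, dtype=torch.bool)
    bilevel_train(LearnableGraphGCN(adj_init, d, 16, c), x, y, mask)
```

In practice the lower level would typically use its own objective (e.g., a validation loss or a sparsity/smoothness regularizer on the adjacency); sharing the training cross-entropy in both levels here only keeps the sketch compact.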
Related papers
- GaGSL: Global-augmented Graph Structure Learning via Graph Information Bottleneck [5.943641527857957]
We propose a novel method named Global-augmented Graph Structure Learning (GaGSL).
The key idea behind GaGSL is to learn a compact and informative graph structure for node classification tasks.
Comprehensive evaluations across a range of datasets reveal the outstanding performance and robustness of GaGSL compared with the state-of-the-art methods.
arXiv Detail & Related papers (2024-11-07T01:23:48Z) - Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
arXiv Detail & Related papers (2024-09-09T12:56:02Z) - GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting, i.e., learning graph structures that generalize across graphs.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z) - SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are the de facto solution for learning on structural data.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z) - Structure-Preserving Graph Representation Learning [43.43429108503634]
We propose a novel Structure-Preserving Graph Representation Learning (SPGRL) method to fully capture the structure information of graphs.
Specifically, to reduce the uncertainty and misinformation of the original graph, we construct a feature graph as a complementary view via the k-Nearest Neighbor method (a generic sketch of such a k-NN feature graph appears after this list).
Our method achieves superior performance on the semi-supervised node classification task and excellent robustness under noise perturbations of the graph structure or node features.
arXiv Detail & Related papers (2022-09-02T02:49:19Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - Compact Graph Structure Learning via Mutual Information Compression [79.225671302689]
Graph Structure Learning (GSL) has attracted considerable attention for its capacity to optimize the graph structure and learn the parameters of Graph Neural Networks (GNNs).
We propose a Compact GSL architecture via mutual information (MI) compression, named CoGSL.
We conduct extensive experiments on several datasets under clean and attacked conditions, which demonstrate the effectiveness and robustness of CoGSL.
arXiv Detail & Related papers (2022-01-14T16:22:33Z) - Learnable Structural Semantic Readout for Graph Classification [23.78861906423389]
We propose structural semantic readout (SSRead) to summarize the node representations at the position level.
SSRead aims to identify structurally meaningful positions by using the semantic alignment between node representations and structural prototypes.
Our experimental results demonstrate that SSRead significantly improves the classification performance and interpretability of GNN classifiers.
arXiv Detail & Related papers (2021-11-22T20:44:27Z) - Multi-Level Graph Convolutional Network with Automatic Graph Learning for Hyperspectral Image Classification [63.56018768401328]
We propose a Multi-level Graph Convolutional Network with an Automatic Graph Learning method (MGCN-AGL) for hyperspectral image (HSI) classification.
By employing an attention mechanism to characterize the importance of spatially neighboring regions, the most relevant information can be adaptively incorporated into decision making.
Our MGCN-AGL encodes long-range dependencies among image regions based on the expressive representations produced at the local level.
arXiv Detail & Related papers (2020-09-19T09:26:20Z) - Deep graph learning for semi-supervised classification [11.260083018676548]
Graph learning (GL) can dynamically capture the distribution structure (graph structure) of data based on graph convolutional networks (GCNs).
Existing methods mostly combine the computational layer and the related losses into the GCN to explore either the global or the local graph.
Deep graph learning (DGL) is proposed to find a better graph representation for semi-supervised classification.
arXiv Detail & Related papers (2020-05-29T05:59:45Z)
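The complementary feature graph summarized in the Structure-Preserving Graph Representation Learning entry above can be sketched generically as a k-nearest-neighbor graph over node attributes. This is not SPGRL's implementation; the cosine metric, the symmetrization step, and the name knn_feature_graph are assumptions for illustration.

```python
# Generic sketch (not SPGRL's code) of building a complementary "feature graph"
# from node attributes via k-nearest neighbours.
import numpy as np
from sklearn.neighbors import kneighbors_graph


def knn_feature_graph(features: np.ndarray, k: int = 10) -> np.ndarray:
    """Return a symmetric 0/1 adjacency linking each node to its k most similar nodes."""
    knn = kneighbors_graph(features, n_neighbors=k, metric="cosine", include_self=False)
    adj = knn.toarray()
    return np.maximum(adj, adj.T)           # symmetrize: keep an edge if either endpoint chose it


if __name__ == "__main__":
    x = np.random.rand(100, 16)             # 100 nodes with 16-dimensional features
    a_feat = knn_feature_graph(x, k=5)
    print(a_feat.shape, int(a_feat.sum()))  # complementary view used alongside the original graph
```

The symmetrized k-NN graph then serves as the second view that is trained jointly with the original graph structure.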