Learnable Structural Semantic Readout for Graph Classification
- URL: http://arxiv.org/abs/2111.11523v1
- Date: Mon, 22 Nov 2021 20:44:27 GMT
- Title: Learnable Structural Semantic Readout for Graph Classification
- Authors: Dongha Lee, Su Kim, Seonghyeon Lee, Chanyoung Park, Hwanjo Yu
- Abstract summary: We propose structural semantic readout (SSRead) to summarize the node representations at the position-level.
SSRead aims to identify structurally-meaningful positions by using the semantic alignment between its nodes and structural prototypes.
Our experimental results demonstrate that SSRead significantly improves the classification performance and interpretability of GNN classifiers.
- Score: 23.78861906423389
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the great success of deep learning in various domains, graph neural
networks (GNNs) have also become a dominant approach to graph classification. With
the help of a global readout operation that simply aggregates all node (or
node-cluster) representations, existing GNN classifiers obtain a graph-level
representation of an input graph and predict its class label using the
representation. However, such global aggregation does not consider the
structural information of each node, which results in information loss on the
global structure. In particular, it limits the discrimination power by forcing the
classifier to use the same weight parameters for all node representations; in
practice, each node contributes to the target classes differently depending on its
structural semantics. In this work, we propose structural semantic readout (SSRead)
to summarize the node representations at the position level, which makes it
possible to model position-specific weight parameters for classification as well
as to effectively capture the graph semantics relevant to
the global structure. Given an input graph, SSRead aims to identify
structurally-meaningful positions by using the semantic alignment between its
nodes and structural prototypes, which encode the prototypical features of each
position. The structural prototypes are optimized to minimize the alignment
cost for all training graphs, while the other GNN parameters are trained to
predict the class labels. Our experimental results demonstrate that SSRead
significantly improves the classification performance and interpretability of
GNN classifiers while being compatible with a variety of aggregation functions,
GNN architectures, and learning frameworks.
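To make the readout idea concrete, below is a minimal PyTorch sketch of a prototype-alignment readout in the spirit of SSRead: node embeddings are aligned to learnable structural prototypes, pooled per position, and fed to a classifier whose weights are partitioned by position. The module name, the soft (softmax) alignment, and all sizes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeReadout(nn.Module):
    """Prototype-alignment readout sketch (illustrative, not the authors' code)."""
    def __init__(self, hidden_dim: int, num_positions: int, num_classes: int):
        super().__init__()
        # Learnable structural prototypes, one per structural "position".
        self.prototypes = nn.Parameter(torch.randn(num_positions, hidden_dim))
        # Classifier over concatenated position representations:
        # each position receives its own slice of the weight matrix.
        self.classifier = nn.Linear(num_positions * hidden_dim, num_classes)

    def forward(self, node_emb: torch.Tensor) -> torch.Tensor:
        # node_emb: (num_nodes, hidden_dim) for one input graph.
        # Soft alignment of every node to every structural prototype.
        align = F.softmax(node_emb @ self.prototypes.t(), dim=-1)       # (N, P)
        # Position-level summary: alignment-weighted average of node embeddings.
        weights = align / (align.sum(dim=0, keepdim=True) + 1e-8)       # normalize per position
        pos_repr = weights.t() @ node_emb                               # (P, hidden_dim)
        return self.classifier(pos_repr.flatten())                      # (num_classes,)

# Toy usage: 6 node embeddings (e.g., from a GNN encoder), 4 positions, 3 classes.
readout = PrototypeReadout(hidden_dim=16, num_positions=4, num_classes=3)
logits = readout(torch.randn(6, 16))
print(logits.shape)  # torch.Size([3])
```

Because the position representations are concatenated in a fixed order, each position effectively gets its own block of classifier weights, which is one way to realize the position-specific weighting the abstract refers to.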
Related papers
- Graph Structure Learning with Bi-level Optimization [2.2435959256503377]
We propose a novel Graph Structure Learning (GSL) method to improve the robustness of graph neural networks (GNNs) from a global view.
We apply a generic structure extractor to recast GNNs into a form that learns the graph structure and the common parameters.
We model the learning process as a novel bi-level optimization, i.e., Generic Structure Extraction with Bi-level Optimization for Graph Structure Learning (GSEBO).
We instantiate the proposed GSEBO on classical GNNs and compare it with the state-of-the-art GSL methods.
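As a rough illustration of the alternating, bi-level-style training described here, the sketch below fits GNN weights in an inner loop while a learnable dense adjacency (a crude stand-in for the structure extractor) is updated in an outer step. The tensor shapes, the sigmoid parameterization of the structure, and the loop counts are assumptions for illustration only.

```python
import torch
import torch.nn as nn

# Toy node-classification setup; sizes and data are arbitrary.
feat_dim, hidden, num_nodes = 8, 16, 12
x = torch.randn(num_nodes, feat_dim)
y = torch.randint(0, 2, (num_nodes,))

gnn = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 2))
# Stand-in for the structure extractor: a learnable dense adjacency (logits).
adj_logits = nn.Parameter(torch.zeros(num_nodes, num_nodes))

opt_gnn = torch.optim.Adam(gnn.parameters(), lr=1e-2)
opt_struct = torch.optim.Adam([adj_logits], lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def predict(x):
    adj = torch.sigmoid(adj_logits)                       # learned structure
    adj = adj / (adj.sum(dim=1, keepdim=True) + 1e-8)     # row-normalize
    return gnn(adj @ x)                                   # one propagation step, then MLP

for outer_step in range(5):
    for inner_step in range(10):                          # inner level: fit GNN weights
        opt_gnn.zero_grad()
        loss_fn(predict(x), y).backward()
        opt_gnn.step()
    opt_struct.zero_grad()                                # outer level: update structure
    loss_fn(predict(x), y).backward()
    opt_struct.step()
```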
arXiv Detail & Related papers (2024-11-26T03:00:30Z)
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift occurs.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
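A hedged sketch of the decoupling idea summarized above: one encoder embeds node attributes, another embeds propagated (structure-aware) features, and a consistency term ties the two spaces together before the embeddings are combined. The module name, the MSE-based consistency term, and the single-hop propagation are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledEncoders(nn.Module):
    def __init__(self, feat_dim: int, hidden: int):
        super().__init__()
        self.attr_enc = nn.Linear(feat_dim, hidden)    # attribute embedding (ignores structure)
        self.graph_enc = nn.Linear(feat_dim, hidden)   # structure-aware embedding

    def forward(self, x, adj):
        z_attr = self.attr_enc(x)
        z_graph = self.graph_enc(adj @ x)              # one-hop propagation before encoding
        # Structural consistency: the two views of each node should roughly agree.
        consistency = F.mse_loss(F.normalize(z_attr, dim=-1),
                                 F.normalize(z_graph, dim=-1))
        return torch.cat([z_attr, z_graph], dim=-1), consistency

x, adj = torch.randn(10, 8), torch.eye(10)             # toy features and adjacency
z, consistency_loss = DecoupledEncoders(8, 16)(x, adj)
```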
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Local Structure-aware Graph Contrastive Representation Learning [12.554113138406688]
We propose a Local Structure-aware Graph Contrastive representation Learning method (LS-GCL) to model the structural information of nodes from multiple views.
For the local view, the semantic subgraph of each target node is fed into a shared GNN encoder to obtain the target node embeddings at the subgraph level.
For the global view, considering that the original graph preserves indispensable semantic information about the nodes, we leverage the shared GNN encoder to learn the target node embeddings at the global graph level.
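The two views described above can be contrasted with a standard InfoNCE-style objective, sketched below under the assumption that a shared encoder has already produced a local (subgraph-level) and a global (graph-level) embedding for each target node; the temperature and batch setup are illustrative.

```python
import torch
import torch.nn.functional as F

def info_nce(z_local, z_global, temperature=0.5):
    # z_local, z_global: (batch, dim); row i of each is the same target node under two views.
    z_local = F.normalize(z_local, dim=-1)
    z_global = F.normalize(z_global, dim=-1)
    logits = z_local @ z_global.t() / temperature   # similarity of every local/global pair
    labels = torch.arange(z_local.size(0))          # positive pairs sit on the diagonal
    return F.cross_entropy(logits, labels)

# Toy batch: 4 target nodes, each embedded once from its subgraph and once from the full graph.
loss = info_nce(torch.randn(4, 16), torch.randn(4, 16))
```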
arXiv Detail & Related papers (2023-08-07T03:23:46Z)
- Omni-Granular Ego-Semantic Propagation for Self-Supervised Graph Representation Learning [6.128446481571702]
Unsupervised/self-supervised graph representation learning is critical for downstream node- and graph-level classification tasks.
We introduce instance-adaptive global-aware ego-semantic descriptors.
The descriptors can be explicitly integrated into local graph convolution as new neighbor nodes.
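A very rough sketch of the stated mechanism, treating a global descriptor as one extra neighbor during aggregation. Here the descriptor is simply the graph's mean embedding; the actual instance-adaptive, omni-granular descriptors are considerably more involved.

```python
import torch

def conv_with_descriptor(x, adj):
    # x: (N, d) node features; adj: (N, N) row-normalized adjacency.
    descriptor = x.mean(dim=0, keepdim=True)      # global descriptor (a crude stand-in)
    neighbor_agg = adj @ x                        # usual local neighborhood aggregation
    return (neighbor_agg + descriptor) / 2        # descriptor acts as one extra neighbor

x = torch.randn(5, 8)
adj = torch.full((5, 5), 0.2)                     # toy row-normalized adjacency
out = conv_with_descriptor(x, adj)
```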
arXiv Detail & Related papers (2022-05-31T12:31:33Z)
- Understanding Graph Convolutional Networks for Text Classification [9.495731689143827]
We conduct a comprehensive analysis of the roles of node and edge embeddings in a graph and of the associated GCN learning techniques for text classification.
Our analysis is the first of its kind and provides useful insights into the importance of each graph node/edge construction mechanism.
arXiv Detail & Related papers (2022-03-30T05:14:31Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
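The three-stage pipeline described above can be pictured with the simplified sketch below, which aggregates meta-path instances with attention and then fuses across meta-paths; the tree-attention aggregator is abstracted into a single linear attention layer, and all names and shapes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaPathAggregator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.att = nn.Linear(dim, 1)     # attention over instances of one meta-path
        self.fuse = nn.Linear(dim, 1)    # attention over different meta-paths

    def forward(self, instance_embs):
        # instance_embs: list over meta-paths, each tensor of shape (num_instances, dim).
        per_path = []
        for embs in instance_embs:
            a = F.softmax(self.att(embs), dim=0)       # weight each meta-path instance
            per_path.append((a * embs).sum(dim=0))     # meta-path-level embedding
        stacked = torch.stack(per_path)                # (num_meta_paths, dim)
        w = F.softmax(self.fuse(stacked), dim=0)       # weight each meta-path
        return (w * stacked).sum(dim=0)                # fused node embedding

agg = MetaPathAggregator(16)
fused = agg([torch.randn(3, 16), torch.randn(5, 16)])  # two meta-paths with 3 and 5 instances
```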
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
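A compact sketch of the prototype/EM idea mentioned above: graph embeddings are softly assigned to learnable prototypes (E-step) and the prototypes are refreshed as exponential moving averages of the assigned embeddings (M-step). The hierarchy of prototypes and the actual training losses are omitted; everything here is illustrative.

```python
import torch
import torch.nn.functional as F

def em_step(graph_embs, prototypes, momentum=0.9):
    # E-step: soft assignment of each graph embedding to each prototype.
    assign = F.softmax(graph_embs @ prototypes.t(), dim=-1)                        # (B, K)
    # M-step: online (EMA) update of prototypes toward the assigned embeddings' weighted mean.
    weighted_mean = assign.t() @ graph_embs / (assign.sum(0).unsqueeze(1) + 1e-8)  # (K, dim)
    return momentum * prototypes + (1 - momentum) * weighted_mean

prototypes = torch.randn(4, 16)                   # 4 semantic prototypes, 16-dim embeddings
for _ in range(3):                                # a few online EM steps over random batches
    prototypes = em_step(torch.randn(32, 16), prototypes)
```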
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Structure-Enhanced Meta-Learning For Few-Shot Graph Classification [53.54066611743269]
This work explores the potential of metric-based meta-learning for solving few-shot graph classification.
An implementation upon GIN, named SMFGIN, is tested on two datasets, Chembl and TRIANGLES.
arXiv Detail & Related papers (2021-03-05T09:03:03Z)
- Structured Graph Learning for Clustering and Semi-supervised Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and an adaptive-neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
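The self-expressiveness idea referenced above can be sketched numerically: each sample is reconstructed as a combination of the other samples, and the symmetrized coefficient matrix serves as a global affinity graph for clustering. The ridge-regularized closed form below is a common simplification, not the paper's exact objective.

```python
import numpy as np

X = np.random.randn(20, 8)                       # 20 samples, 8 features
lam = 0.1
G = X @ X.T                                      # Gram matrix of samples
C = np.linalg.solve(G + lam * np.eye(20), G)     # ridge solution of min_C ||X - C X||^2 + lam ||C||^2
np.fill_diagonal(C, 0)                           # a sample should not explain itself
affinity = (np.abs(C) + np.abs(C.T)) / 2         # symmetric global affinity graph for clustering
```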
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
- Deep graph learning for semi-supervised classification [11.260083018676548]
Graph learning (GL) can dynamically capture the distribution structure (graph structure) of data based on graph convolutional networks (GCNs).
Existing methods mostly combine the computational layer and the related losses into GCN for exploring the global graph or local graph.
Deep graph learning (DGL) is proposed to find a better graph representation for semi-supervised classification.
arXiv Detail & Related papers (2020-05-29T05:59:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.