Distribution Knowledge Embedding for Graph Pooling
- URL: http://arxiv.org/abs/2109.14333v1
- Date: Wed, 29 Sep 2021 10:36:12 GMT
- Title: Distribution Knowledge Embedding for Graph Pooling
- Authors: Kaixuan Chen, Jie Song, Shunyu Liu, Na Yu, Zunlei Feng, Mingli Song
- Abstract summary: We argue that what is crucial to graph-level downstream tasks includes not only the topological structure but also the distribution from which nodes are sampled.
Powered by existing Graph Neural Networks (GNNs), we propose a new plug-and-play pooling module, termed Distribution Knowledge Embedding (DKEPool).
A DKEPool network effectively decomposes representation learning into two stages: structure learning and distribution learning.
- Score: 32.78414015096222
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph-level representation learning is the pivotal step for downstream tasks
that operate on the whole graph. The most common approach to this problem
heretofore is graph pooling, where node features are typically averaged or
summed to obtain the graph representations. However, pooling operations like
averaging or summing inevitably cause massive information loss, which may
severely degrade the final performance. In this paper, we argue that what is
crucial to graph-level downstream tasks includes not only the topological
structure but also the distribution from which nodes are sampled. Therefore,
powered by existing Graph Neural Networks (GNNs), we propose a new plug-and-play
pooling module, termed Distribution Knowledge Embedding (DKEPool), where
graphs are recast as distributions on top of GNNs and the pooling goal is to
summarize the entire distribution information instead of retaining a single
feature vector produced by simple predefined pooling operations. A DKEPool
network effectively decomposes representation learning into two stages:
structure learning and distribution learning. Structure learning follows a
recursive neighborhood aggregation scheme to update node features, from which
structure information is obtained. Distribution learning, on the other hand,
omits node interconnections and focuses on the distribution described by all
the nodes. Extensive experiments demonstrate that the proposed DKEPool
significantly and consistently outperforms state-of-the-art methods.
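As a rough, hedged illustration of the two-stage recipe described above, the sketch below (plain PyTorch; class names such as DistributionPool and DKEPoolLikeNet are hypothetical, not the authors' implementation) runs a couple of neighborhood-aggregation layers for structure learning and then summarizes the node set by its empirical mean and covariance rather than a plain sum or average. The exact distribution embedding used by DKEPool may differ.

```python
# Hedged sketch of a DKEPool-style network: structure learning via simple
# neighborhood aggregation on a dense, row-normalized adjacency, followed by
# distribution learning that summarizes node embeddings with their empirical
# mean and covariance. Names are illustrative, not the authors' implementation.
import torch
import torch.nn as nn


class SimpleGNNLayer(nn.Module):
    """One round of neighborhood aggregation: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj is assumed to be row-normalized and to include self-loops.
        return torch.relu(adj @ self.lin(x))


class DistributionPool(nn.Module):
    """Summarize the node-feature distribution with first- and second-order
    statistics (mean and flattened covariance) and project to a fixed size."""

    def __init__(self, feat_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(feat_dim + feat_dim * feat_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mean = x.mean(dim=0)                                    # (d,)
        centered = x - mean                                     # (n, d)
        cov = centered.t() @ centered / max(x.size(0) - 1, 1)   # (d, d)
        return self.proj(torch.cat([mean, cov.flatten()]))


class DKEPoolLikeNet(nn.Module):
    """Two stages: structure learning (GNN layers), then distribution
    learning (a distribution-aware readout over all node embeddings)."""

    def __init__(self, in_dim: int, hid_dim: int, num_classes: int):
        super().__init__()
        self.gnn1 = SimpleGNNLayer(in_dim, hid_dim)
        self.gnn2 = SimpleGNNLayer(hid_dim, hid_dim)
        self.pool = DistributionPool(hid_dim, hid_dim)
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.gnn2(self.gnn1(x, adj), adj)   # stage 1: structure learning
        g = self.pool(h)                        # stage 2: distribution learning
        return self.classifier(g)
```

Because the readout only consumes the final node embeddings, a module like DistributionPool can in principle be plugged into any GNN backbone in place of mean or sum pooling, which is what makes this kind of approach plug-and-play.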
Related papers
- MDS-GNN: A Mutual Dual-Stream Graph Neural Network on Graphs with Incomplete Features and Structure [8.00268216176428]
Graph Neural Networks (GNNs) have emerged as powerful tools for analyzing and learning representations from graph-structured data.
A crucial prerequisite for the outstanding performance of GNNs is the availability of complete graph information.
This study proposes a mutual dual-stream graph neural network (MDS-GNN) that implements mutual-benefit learning between node features and graph structure.
arXiv Detail & Related papers (2024-08-09T03:42:56Z) - SSHPool: The Separated Subgraph-based Hierarchical Pooling [47.78745802682822]
We develop a novel local graph pooling method, namely the Separated Subgraph-based Hierarchical Pooling (SSHPool), for graph classification.
Local graph convolution units are individually employed on each subgraph to compress it into a coarsened node.
We develop an end-to-end GNN framework associated with the SSHPool module for graph classification.
arXiv Detail & Related papers (2024-03-24T13:03:35Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - Higher-order Clustering and Pooling for Graph Neural Networks [77.47617360812023]
Graph Neural Networks achieve state-of-the-art performance on a plethora of graph classification tasks.
HoscPool is a clustering-based graph pooling operator that captures higher-order information hierarchically.
We evaluate HoscPool on graph classification tasks and its clustering component on graphs with ground-truth community structure.
arXiv Detail & Related papers (2022-09-02T09:17:10Z) - LiftPool: Lifting-based Graph Pooling for Hierarchical Graph
Representation Learning [53.176603566951016]
We propose an enhanced three-stage method via lifting, named LiftPool, to improve hierarchical graph representation.
For each node to be removed, its local information is obtained by subtracting the global information aggregated from its neighboring preserved nodes (a minimal sketch of this subtraction step appears after this list).
Evaluations on benchmark graph datasets show that LiftPool substantially outperforms the state-of-the-art graph pooling methods in the task of graph classification.
arXiv Detail & Related papers (2022-04-27T12:38:02Z) - Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model is able to capture nodes' structural role information, and show excellent performance on node and graph classification tasks over unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z) - Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z) - Structure-Aware Hierarchical Graph Pooling using Information Bottleneck [2.7088996845250897]
Graph pooling is an essential ingredient of Graph Neural Networks (GNNs) in graph classification and regression tasks.
We propose a novel pooling method, named HIBPool, that leverages the Information Bottleneck (IB) principle.
We also introduce a novel structure-aware Discriminative Pooling Readout (DiP-Readout) function to capture the informative local subgraph structures in the graph.
arXiv Detail & Related papers (2021-04-27T07:27:43Z) - Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z) - CommPOOL: An Interpretable Graph Pooling Framework for Hierarchical
Graph Representation Learning [74.90535111881358]
We propose a new interpretable graph pooling framework - CommPOOL.
It can capture and preserve the hierarchical community structure of graphs in the graph representation learning process.
CommPOOL is a general and flexible framework for hierarchical graph representation learning.
arXiv Detail & Related papers (2020-12-10T21:14:18Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
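The LiftPool entry above describes obtaining each removed node's local information by subtracting information aggregated from its preserved neighbors. Below is a minimal, hedged sketch of that subtraction step in plain PyTorch; the function and variable names are illustrative assumptions, not the authors' implementation, and the actual lifting scheme involves additional stages.

```python
# Hedged sketch of the lifting-style subtraction described for LiftPool:
# each node selected for removal keeps only the part of its feature that is
# not already carried by its preserved neighbors. Names are illustrative.
import torch


def lift_removed_nodes(x: torch.Tensor,
                       adj: torch.Tensor,
                       keep_mask: torch.Tensor) -> torch.Tensor:
    """x: (n, d) node features; adj: (n, n) 0/1 adjacency;
    keep_mask: (n,) bool, True for preserved nodes.
    Returns the local information of the removed nodes."""
    # Keep only edges that point to preserved nodes.
    adj_to_kept = adj * keep_mask.float()
    # Mean-aggregate the features of each node's preserved neighbors.
    deg = adj_to_kept.sum(dim=1, keepdim=True).clamp(min=1.0)
    neighbor_info = (adj_to_kept @ x) / deg
    # Local information: own feature minus what preserved neighbors provide.
    local_info = x - neighbor_info
    return local_info[~keep_mask]


# Tiny usage example on a 3-node path graph, removing the middle node:
# x = torch.randn(3, 4)
# adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
# keep = torch.tensor([True, False, True])
# lift_removed_nodes(x, adj, keep)   # -> (1, 4) tensor for node 1
```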
This list is automatically generated from the titles and abstracts of the papers listed on this site.