MxPool: Multiplex Pooling for Hierarchical Graph Representation Learning
- URL: http://arxiv.org/abs/2004.06846v1
- Date: Wed, 15 Apr 2020 01:05:29 GMT
- Title: MxPool: Multiplex Pooling for Hierarchical Graph Representation Learning
- Authors: Yanyan Liang, Yanfeng Zhang, Dechao Gao, Qian Xu
- Abstract summary: We propose MxPool, which concurrently uses multiple graph convolution/pooling networks to build a hierarchical learning structure for graph representation learning tasks.
Our experiments on numerous graph classification benchmarks show that MxPool outperforms other state-of-the-art graph representation learning methods.
- Score: 7.456657747472885
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: How to utilize deep learning methods for graph classification tasks has
attracted considerable research attention in the past few years. In graph
classification tasks, the graphs to be classified may vary widely in size
(i.e., number of nodes and edges) and in properties (e.g., average node
degree, diameter, and clustering coefficient). This diversity poses
significant challenges for existing graph learning techniques, since
diverse graphs have different best-fit hyperparameters and it is difficult
to learn features from a set of diverse graphs with a single unified graph
neural network. This motivates us to use a multiplex structure with diverse
settings and to exploit a priori graph properties to guide the learning. In
this paper, we propose MxPool, which concurrently uses multiple graph
convolution/pooling networks to build a hierarchical learning structure for
graph representation learning tasks. Experiments on numerous graph
classification benchmarks show that MxPool outperforms other
state-of-the-art graph representation learning methods.
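To make the multiplex idea concrete, below is a minimal sketch of a multiplex pooling model: several graph convolution branches with different hidden sizes run in parallel on the same graph, each produces a graph-level vector via a simple readout, and the vectors are merged into one representation. This is not the authors' code; the dense GCN propagation, the mean-pooling readout, the concatenation-based merge, and all layer sizes are illustrative assumptions rather than details taken from the paper (MxPool additionally uses a priori graph properties to guide the branches).

```python
# Minimal sketch of a multiplex (multi-branch) graph pooling model.
# Assumptions: dense adjacency, GCN-style propagation, mean-pooling readout,
# concatenation-based merge. Layer sizes are arbitrary.
import torch
import torch.nn as nn


class DenseGCNBranch(nn.Module):
    """One graph convolution branch over a dense adjacency matrix."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hidden_dim)

    def forward(self, adj, x):
        # Symmetrically normalized propagation: D^-1/2 (A + I) D^-1/2 X W
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        h = torch.relu(self.lin(a_norm @ x))
        return h.mean(dim=0)  # mean-pooling readout -> graph-level vector


class MultiplexPooling(nn.Module):
    """Parallel branches with different hidden sizes, merged by concatenation."""

    def __init__(self, in_dim, hidden_dims=(16, 32, 64), out_dim=32):
        super().__init__()
        self.branches = nn.ModuleList(
            DenseGCNBranch(in_dim, h) for h in hidden_dims
        )
        self.merge = nn.Linear(sum(hidden_dims), out_dim)

    def forward(self, adj, x):
        parts = [branch(adj, x) for branch in self.branches]
        return self.merge(torch.cat(parts, dim=-1))


if __name__ == "__main__":
    n_nodes, in_dim = 10, 8
    adj = (torch.rand(n_nodes, n_nodes) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()   # symmetrize the random adjacency
    x = torch.randn(n_nodes, in_dim)
    print(MultiplexPooling(in_dim)(adj, x).shape)  # torch.Size([32])
```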
Related papers
- Knowledge Probing for Graph Representation Learning [12.960185655357495]
We propose a novel graph probing framework (GraphProbe) to investigate and interpret whether the family of graph learning methods has encoded different levels of knowledge in graph representation learning.
Based on the intrinsic properties of graphs, we design three probes to systematically investigate the graph representation learning process from different perspectives.
We construct a thorough evaluation benchmark with nine representative graph learning methods from random walk based approaches, basic graph neural networks and self-supervised graph methods, and probe them on six benchmark datasets for node classification, link prediction and graph classification.
arXiv Detail & Related papers (2024-08-07T16:27:45Z)
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
arXiv Detail & Related papers (2024-04-21T13:11:59Z)
- Representation learning in multiplex graphs: Where and how to fuse information? [5.0235828656754915]
Multiplex graphs possess richer information, provide better modeling capabilities and integrate more detailed data from potentially different sources.
In this paper, we tackle the problem of learning representations for nodes in multiplex networks in an unsupervised or self-supervised manner.
We propose improvements in how to construct GNN architectures that deal with multiplex graphs.
arXiv Detail & Related papers (2024-02-27T21:47:06Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
arXiv Detail & Related papers (2022-05-30T13:20:26Z)
- Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z)
- Second-Order Pooling for Graph Neural Networks [62.13156203025818]
We propose to use second-order pooling as graph pooling, which naturally solves the above challenges.
We show that direct use of second-order pooling with graph neural networks leads to practical problems.
We propose two novel global graph pooling methods based on second-order pooling, namely bilinear mapping and attentional second-order pooling (a minimal sketch of plain second-order pooling appears after this list).
arXiv Detail & Related papers (2020-07-20T20:52:36Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
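As referenced in the Second-Order Pooling entry above, here is a minimal sketch (an assumed illustration, not the paper's code) of plain second-order pooling: the graph-level representation is the flattened matrix H^T H = sum_i h_i h_i^T over the node embeddings H, which is permutation invariant and captures feature co-occurrence. The paper's bilinear-mapping and attentional variants build on this basic operation.

```python
# Minimal sketch of plain second-order pooling (assumed illustration).
import torch


def second_order_pool(h: torch.Tensor) -> torch.Tensor:
    """h: (num_nodes, d) node embeddings -> (d * d,) graph-level embedding."""
    # H^T H sums the outer products h_i h_i^T over all nodes, so the result
    # does not depend on node ordering.
    return (h.t() @ h).flatten()


if __name__ == "__main__":
    h = torch.randn(7, 16)        # 7 nodes, 16-dim embeddings
    print(second_order_pool(h).shape)  # torch.Size([256])
```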