Pooling Architecture Search for Graph Classification
- URL: http://arxiv.org/abs/2108.10587v1
- Date: Tue, 24 Aug 2021 09:03:03 GMT
- Title: Pooling Architecture Search for Graph Classification
- Authors: Lanning Wei, Huan Zhao, Quanming Yao, Zhiqiang He
- Abstract summary: Graph neural networks (GNNs) are designed to learn node-level representation based on neighborhood aggregation schemes.
Pooling methods are applied after the aggregation operation to generate coarse-grained graphs.
It is a challenging problem to design a universal pooling architecture to perform well in most cases.
We propose to use neural architecture search (NAS) to search for adaptive pooling architectures for graph classification.
- Score: 36.728077433219916
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph classification is an important problem with applications across many
domains, like chemistry and bioinformatics, for which graph neural networks
(GNNs) have been state-of-the-art (SOTA) methods. GNNs are designed to learn
node-level representation based on neighborhood aggregation schemes, and to
obtain graph-level representation, pooling methods are applied after the
aggregation operation in existing GNN models to generate coarse-grained graphs.
However, due to the highly diverse applications of graph classification, the
performance of existing pooling methods varies across different graphs. In other
words, it is challenging to design a universal pooling architecture that
performs well in most cases, leading to a demand for data-specific pooling
methods in real-world applications. To address this problem, we propose to use
neural architecture search (NAS) to search for adaptive pooling architectures
for graph classification. First, we design a unified framework consisting of
four modules: Aggregation, Pooling, Readout, and Merge, which can cover
existing human-designed pooling methods for graph classification. Based on this
framework, a novel search space is designed by incorporating popular operations
in human-designed architectures. Then, to enable efficient search, a coarsening
strategy is proposed to continuously relax the search space so that a
differentiable search method can be adopted. Extensive experiments on six
real-world datasets from three domains are conducted, and the results
demonstrate the effectiveness and efficiency of the proposed framework.
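
To make the four-module framework concrete, below is a minimal, illustrative PyTorch sketch (not the authors' implementation): each layer applies an Aggregation step, a score-based Pooling step that coarsens the graph, and a per-layer Readout, while a final Merge step combines the layer-wise readouts for classification. The class names, the dense-adjacency representation, and the specific operation choices are assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FrameworkLayer(nn.Module):
    """One hypothetical layer: Aggregation -> Pooling -> Readout."""

    def __init__(self, in_dim, hidden_dim, pool_ratio=0.5):
        super().__init__()
        self.aggregate = nn.Linear(in_dim, hidden_dim)  # Aggregation: GCN-style mixing
        self.score = nn.Linear(hidden_dim, 1)           # Pooling: TopK-style node scoring
        self.pool_ratio = pool_ratio

    def forward(self, x, adj):
        # Aggregation: average over neighbors (with a self-loop), then a linear map.
        deg = adj.sum(dim=-1, keepdim=True)
        h = F.relu(self.aggregate((adj @ x + x) / (deg + 1)))

        # Pooling: keep the top-k scored nodes and coarsen the adjacency matrix.
        k = max(1, int(self.pool_ratio * h.size(0)))
        scores = torch.sigmoid(self.score(h)).squeeze(-1)
        idx = scores.topk(k).indices
        h_pool = h[idx] * scores[idx].unsqueeze(-1)
        adj_pool = adj[idx][:, idx]

        # Readout: graph-level summary of this layer's coarsened graph.
        readout = torch.cat([h_pool.mean(dim=0), h_pool.max(dim=0).values])
        return h_pool, adj_pool, readout


class PoolingGNN(nn.Module):
    """Stack of framework layers; Merge concatenates the per-layer readouts."""

    def __init__(self, in_dim, hidden_dim, num_classes, num_layers=2):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * num_layers
        self.layers = nn.ModuleList(
            [FrameworkLayer(dims[i], dims[i + 1]) for i in range(num_layers)]
        )
        self.classify = nn.Linear(2 * hidden_dim * num_layers, num_classes)

    def forward(self, x, adj):
        # x: [num_nodes, in_dim] node features; adj: [num_nodes, num_nodes] adjacency.
        readouts = []
        for layer in self.layers:
            x, adj, r = layer(x, adj)
            readouts.append(r)
        # Merge: combine the layer-wise readouts into one graph-level vector.
        return self.classify(torch.cat(readouts))
```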
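The continuous relaxation mentioned in the abstract follows the general recipe of differentiable NAS: each categorical choice among candidate operations is replaced by a softmax-weighted mixture governed by learnable architecture parameters, so the search can be carried out by gradient descent. The sketch below shows such a mixed operation for the Readout module; the candidate set and class name are hypothetical, not the paper's exact search space.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedReadout(nn.Module):
    """Softmax-weighted mixture over candidate readout operations (DARTS-style)."""

    def __init__(self):
        super().__init__()
        # Hypothetical candidate set; the paper's search space covers Aggregation,
        # Pooling, and Merge choices in the same spirit.
        self.candidates = [
            lambda h: h.mean(dim=0),        # mean readout
            lambda h: h.max(dim=0).values,  # max readout
            lambda h: h.sum(dim=0),         # sum readout
        ]
        # One learnable architecture parameter per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.candidates)))

    def forward(self, h):
        # h: [num_nodes, hidden_dim] node embeddings of one (coarsened) graph.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(h) for w, op in zip(weights, self.candidates))
```

After the architecture parameters are trained (typically via bi-level optimization on a validation split), the candidate with the largest weight in `alpha` is retained, yielding a discrete, data-specific architecture.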
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism that identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance with half or fewer model parameters in the searched GNNs and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z) - SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
arXiv Detail & Related papers (2024-04-21T13:11:59Z) - Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z) - Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision [51.88848982611515]
Unsupervised graph neural architecture search remains unexplored in the literature.
We propose a novel Disentangled Self-supervised Graph Neural Architecture Search model.
Our model is able to achieve state-of-the-art performance against several baseline methods in an unsupervised manner.
arXiv Detail & Related papers (2024-03-08T05:23:55Z) - Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z) - Learn Layer-wise Connections in Graph Neural Networks [12.363386808994079]
We propose a framework LLC (Learn Layer-wise Connections) based on neural architecture search (NAS) to learn adaptive connections among intermediate layers in GNNs.
LLC contains one novel search space which consists of 3 types of blocks and learnable connections, and one differentiable search algorithm to enable the efficient search process.
Extensive experiments on five real-world datasets are conducted, and the results show that the searched layer-wise connections can not only improve the performance but also alleviate the over-smoothing problem.
arXiv Detail & Related papers (2021-12-27T09:33:22Z) - Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show EGNAS can search better GNNs with higher performance than current state-of-the-art human-designed and searched-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z) - DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural Networks [45.075163625895286]
We search for a meta graph, which can capture more complex semantic relations than a meta path, to determine how graph neural networks propagate messages along different types of edges.
We design an expressive search space in the form of a directed acyclic graph (DAG) to represent candidate meta graphs for a heterogeneous information network (HIN).
We propose a novel and efficient search algorithm to make the total search cost on a par with training a single GNN once.
arXiv Detail & Related papers (2020-10-07T08:09:29Z) - Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that models the sampling procedure and message passing of GNNs into a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)