NASGEM: Neural Architecture Search via Graph Embedding Method
- URL: http://arxiv.org/abs/2007.04452v2
- Date: Mon, 28 Sep 2020 04:34:42 GMT
- Title: NASGEM: Neural Architecture Search via Graph Embedding Method
- Authors: Hsin-Pai Cheng, Tunhou Zhang, Yixing Zhang, Shiyu Li, Feng Liang, Feng Yan, Meng Li, Vikas Chandra, Hai Li, Yiran Chen
- Abstract summary: We propose NASGEM which stands for Neural Architecture Search via Graph Embedding Method.
It is driven by a novel graph embedding method equipped with similarity measures to capture graph topology information.
It consistently outperforms networks crafted by existing search methods in classification tasks.
- Score: 41.0658375655084
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) automates and advances the design of
neural networks. Estimator-based NAS has recently been proposed to model the
relationship between architectures and their performance, enabling scalable and
flexible search. However, existing estimator-based methods encode the
architecture into a latent space without considering graph similarity. Ignoring
graph similarity in a node-based search space may induce a large inconsistency
between similar graphs and their distances in the continuous encoding space,
leading to inaccurate encoding representations and/or reduced representation
capacity that can yield sub-optimal search results. To preserve graph
correlation information in the encoding, we propose NASGEM, which stands for
Neural Architecture Search via Graph Embedding Method. NASGEM is driven by a
novel graph embedding method equipped with similarity measures to capture graph
topology information. By precisely estimating the graph distance and using an
auxiliary Weisfeiler-Lehman kernel to guide the encoding, NASGEM can exploit
additional structural information to obtain more accurate graph representations
and improve search efficiency. GEMNet, a set of networks discovered by NASGEM,
consistently outperforms networks crafted by existing search methods on
classification tasks, achieving 0.4%-3.6% higher accuracy with 11%-21% fewer
Multiply-Accumulates. We further transfer GEMNet to COCO object detection. In
both one-stage and two-stage detectors, our GEMNet surpasses its
manually-crafted and automatically-searched counterparts.
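
To make the auxiliary kernel concrete, the sketch below computes a Weisfeiler-Lehman (WL) subtree kernel between two toy cell graphs and normalizes it into a similarity score. It illustrates the kernel family NASGEM uses as encoding guidance, not the paper's implementation: the graphs, operation labels, iteration count, undirected treatment of cells, and normalization are all assumptions.

```python
# Minimal Weisfeiler-Lehman (WL) subtree kernel between two toy cell graphs.
# All concrete details here (graphs, labels, iterations) are illustrative
# assumptions; NASGEM's actual kernel and encoder follow the paper.
from collections import Counter

def wl_histogram(adj, labels, iterations=2):
    """Histogram of WL labels accumulated over refinement rounds.

    adj    -- undirected adjacency list {node: [neighbors]}
    labels -- initial node labels, e.g. operation types {node: str}
    """
    hist = Counter(labels.values())
    current = dict(labels)
    for _ in range(iterations):
        # Relabel each node by compressing its label with its sorted
        # neighbor labels (one WL refinement step).
        current = {
            v: current[v] + "|" + ",".join(sorted(current[u] for u in neigh))
            for v, neigh in adj.items()
        }
        hist.update(current.values())
    return hist

def wl_kernel(adj1, lab1, adj2, lab2, iterations=2):
    """Unnormalized WL subtree kernel: dot product of label histograms."""
    h1 = wl_histogram(adj1, lab1, iterations)
    h2 = wl_histogram(adj2, lab2, iterations)
    return sum(h1[k] * h2[k] for k in h1.keys() & h2.keys())

# Two toy cells whose nodes are labeled by operation type.
g1 = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
l1 = {0: "input", 1: "conv3x3", 2: "conv5x5", 3: "output"}
g2 = {0: [1], 1: [0, 2], 2: [1]}
l2 = {0: "input", 1: "conv3x3", 2: "output"}

k12 = wl_kernel(g1, l1, g2, l2)
k11 = wl_kernel(g1, l1, g1, l1)
k22 = wl_kernel(g2, l2, g2, l2)
print(k12 / (k11 * k22) ** 0.5)  # normalized similarity in [0, 1]
```

In NASGEM's setting, the graph encoder is trained so that distances between embeddings stay consistent with this kind of kernel similarity, which is what keeps similar graphs close in the continuous encoding space.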
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism that identifies important sub-architectures via the valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance with half or fewer of the model parameters of searched GNNs and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z) - Efficient and Explainable Graph Neural Architecture Search via
Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human effort and computational cost, graph neural architecture search (Graph NAS) has been used to search for a sub-optimal GNN architecture.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable.
arXiv Detail & Related papers (2023-08-30T03:21:45Z) - GraphPNAS: Learning Distribution of Good Neural Architectures via Deep
Graph Generative Models [48.57083463364353]
We study neural architecture search (NAS) through the lens of learning random graph models.
We propose GraphPNAS, a deep graph generative model that learns a distribution of well-performing architectures.
We show that our proposed graph generator consistently outperforms RNN-based generators and achieves performance better than or comparable to state-of-the-art NAS methods.
arXiv Detail & Related papers (2022-11-28T09:09:06Z) - NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search [55.75621026447599]
We propose NAS-Bench-Graph, a tailored benchmark that supports unified, reproducible, and efficient evaluations for GraphNAS.
Specifically, we construct a unified, expressive yet compact search space, covering 26,206 unique graph neural network (GNN) architectures.
Based on our proposed benchmark, the performance of GNN architectures can be obtained directly from a look-up table without any further computation (a toy sketch of this look-up style appears after this list).
arXiv Detail & Related papers (2022-06-18T10:17:15Z) - Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The searched operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z) - Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show EGNAS can find GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z) - Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a novel search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z) - DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural
Networks [45.075163625895286]
We search for a meta graph, which can capture more complex semantic relations than a meta path, to determine how graph neural networks propagate messages along different types of edges.
We design an expressive search space in the form of a directed acyclic graph (DAG) to represent candidate meta graphs for a heterogeneous information network (HIN).
We propose a novel and efficient search algorithm that keeps the total search cost on par with training a single GNN once; a minimal sketch of this differentiable-search idea follows this list.
arXiv Detail & Related papers (2020-10-07T08:09:29Z)
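Likewise, the look-up-table evaluation that NAS-Bench-Graph provides can be pictured as a plain table query replacing training inside the search loop. The keys and accuracy values below are invented placeholders, not entries from the actual benchmark.

```python
# Table-based evaluation in the style NAS-Bench-Graph enables: a precomputed
# table replaces training inside the search loop. Keys and accuracies below
# are invented placeholders, not real benchmark entries.
benchmark = {
    ("gcn", "gat", 64): 0.712,    # (op1, op2, hidden_dim) -> val. accuracy
    ("gat", "sage", 64): 0.734,
    ("gcn", "gcn", 128): 0.705,
}

def evaluate(arch):
    """O(1) look-up instead of a full training run."""
    return benchmark.get(arch)

best = max(benchmark, key=benchmark.get)
print(best, evaluate(best))
```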