DFG-NAS: Deep and Flexible Graph Neural Architecture Search
- URL: http://arxiv.org/abs/2206.08582v1
- Date: Fri, 17 Jun 2022 06:47:21 GMT
- Title: DFG-NAS: Deep and Flexible Graph Neural Architecture Search
- Authors: Wentao Zhang, Zheyu Lin, Yu Shen, Yang Li, Zhi Yang, Bin Cui
- Abstract summary: This paper proposes DFG-NAS, a new neural architecture search (NAS) method that enables the automatic search of very deep and flexible GNN architectures.
DFG-NAS highlights another level of design: the search for macro-architectures on how atomic propagation (P) and transformation (T) operations are integrated and organized into a GNN.
Empirical studies on four node classification tasks demonstrate that DFG-NAS outperforms state-of-the-art manual designs and NAS methods of GNNs.
- Score: 27.337894841649494
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have been intensively applied to various
graph-based applications. Despite their success, manually designing
well-behaved GNNs requires immense human expertise, and it is therefore
inefficient to discover the potentially optimal data-specific GNN
architecture. This paper
proposes DFG-NAS, a new neural architecture search (NAS) method that enables
the automatic search of very deep and flexible GNN architectures. Unlike most
existing methods that focus on micro-architectures, DFG-NAS highlights another
level of design: the search for macro-architectures, i.e., how atomic
propagation (P) and transformation (T) operations are integrated and
organized into a GNN. To this end, DFG-NAS proposes a novel search space
for P-T permutations and combinations based on message-passing
disaggregation, defines four custom-designed macro-architecture mutations,
and employs an evolutionary algorithm to conduct
an efficient and effective search. Empirical studies on four node
classification tasks demonstrate that DFG-NAS outperforms state-of-the-art
manual designs and NAS methods of GNNs.
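As a reading aid, here is a minimal sketch (in Python/NumPy) of the macro-architecture idea described above, under two assumptions: P is multiplication by the symmetrically normalized adjacency matrix, and T is a ReLU-activated linear layer with randomly initialized weights. The mutate() function applies generic insert/delete/flip edits and only illustrates the idea of evolving P-T sequences; it is not the four custom mutations the paper defines.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def run_pt_architecture(arch, A_norm, X, hidden=16):
    """Execute a macro-architecture given as a string over {'P', 'T'}.

    'P' propagates features over the graph (no parameters); 'T' applies a
    (here randomly initialized) nonlinear feature transformation.
    """
    H = X
    for op in arch:
        if op == 'P':
            H = A_norm @ H
        else:
            W = rng.normal(scale=0.1, size=(H.shape[1], hidden))
            H = np.maximum(H @ W, 0.0)
    return H

def mutate(arch):
    """Generic mutation for illustration: insert, delete, or flip one op."""
    arch = list(arch)
    i = int(rng.integers(len(arch)))
    choice = int(rng.integers(3))
    if choice == 0:
        arch.insert(i, str(rng.choice(['P', 'T'])))
    elif choice == 1 and len(arch) > 1:
        del arch[i]
    else:
        arch[i] = 'T' if arch[i] == 'P' else 'P'
    return ''.join(arch)

# Toy 4-cycle graph with random node features.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
A_norm = normalized_adjacency(A)

arch = 'PTPT'  # GCN-like interleaving of propagation and transformation
child = mutate(arch)
print(arch, '->', child, run_pt_architecture(child, A_norm, X).shape)
```

In this encoding, a sequence such as PTPT corresponds to the usual propagate-then-transform layering, while sequences like PPPPTT decouple propagation depth from transformation depth, which is how such a search space can accommodate very deep architectures.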
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism that identifies important sub-architectures via the valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance with half or fewer of the model parameters of the searched GNNs and a sparser graph.
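To illustrate the graph-sparsification half of this joint mechanism, the hypothetical sketch below prunes the lowest-scoring edges before any architecture is evaluated; the scoring rule (cosine similarity of endpoint features) is an assumption for illustration only, not GASSIP's learned criterion.

```python
import numpy as np

def sparsify_graph(A, X, keep_ratio=0.5):
    """Keep only the top-scoring fraction of edges; the (assumed) score is
    the cosine similarity of the two endpoint feature vectors."""
    rows, cols = np.nonzero(np.triu(A, 1))       # score each undirected edge once
    norms = np.linalg.norm(X, axis=1) + 1e-12
    scores = (X[rows] * X[cols]).sum(axis=1) / (norms[rows] * norms[cols])
    k = max(1, int(keep_ratio * len(scores)))
    keep = np.argsort(scores)[-k:]               # indices of the top-k edges
    A_sparse = np.zeros_like(A)
    A_sparse[rows[keep], cols[keep]] = 1.0
    return np.maximum(A_sparse, A_sparse.T)      # symmetrize back

rng = np.random.default_rng(1)
A = np.triu((rng.random((6, 6)) < 0.5).astype(float), 1)
A = A + A.T                                      # random undirected toy graph
X = rng.normal(size=(6, 4))
print(int(A.sum() // 2), 'edges ->', int(sparsify_graph(A, X).sum() // 2), 'edges')
```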
arXiv Detail & Related papers (2024-06-24T06:53:37Z) - Efficient and Explainable Graph Neural Architecture Search via Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human effort and computational cost, graph neural architecture search (Graph NAS) has been used to find a well-performing GNN architecture automatically.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable.
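Since the search algorithm here is Monte-Carlo tree search, the sketch below shows a generic MCTS loop over a toy architecture space; the per-layer candidates and the stub evaluate() (standing in for validation accuracy) are hypothetical and are not taken from the ExGNAS paper.

```python
import math, random

random.seed(0)
CHOICES = ['gcn', 'gat', 'sage']   # hypothetical per-layer candidates
DEPTH = 3                          # number of layer choices to make
stats = {}                         # partial architecture -> (visits, total reward)

def evaluate(arch):
    """Stub reward standing in for validation accuracy of the full GNN."""
    return random.random() + 0.1 * arch.count('gat')

def ucb(parent, child):
    """Upper-confidence bound balancing exploration and exploitation."""
    v, r = stats.get(child, (0, 0.0))
    if v == 0:
        return float('inf')        # always try unvisited children first
    pv, _ = stats[parent]
    return r / v + math.sqrt(2.0 * math.log(pv) / v)

def mcts_iteration():
    arch, path = (), [()]
    while len(arch) < DEPTH:       # selection/expansion down the tree
        parent = arch
        arch = max((parent + (c,) for c in CHOICES),
                   key=lambda child: ucb(parent, child))
        path.append(arch)
    reward = evaluate(arch)        # simulation: score the complete architecture
    for node in path:              # backpropagate the reward along the path
        v, r = stats.get(node, (0, 0.0))
        stats[node] = (v + 1, r + reward)
    return arch, reward

best = max((mcts_iteration() for _ in range(200)), key=lambda x: x[1])
print('best sampled architecture:', best)
```

One appeal of a tree-structured search like this is that the visit statistics themselves document why each choice was made, which is the kind of explainable decision process the summary refers to.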
arXiv Detail & Related papers (2023-08-30T03:21:45Z) - Auto-HeG: Automated Graph Neural Network on Heterophilic Graphs [62.665761463233736]
We propose Auto-HeG, an automated graph neural network search framework for heterophilic graphs, which automatically builds heterophilic GNN models.
Specifically, Auto-HeG incorporates heterophily into all stages of automatic heterophilic graph learning, including search space design, supernet training, and architecture selection.
arXiv Detail & Related papers (2023-02-23T22:49:56Z) - Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The node and relation learning operations in its search space can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z) - Search For Deep Graph Neural Networks [4.3002928862077825]
Current GNN-oriented NAS methods focus on searching layer aggregation components within shallow and simple architectures.
We propose a GNN generation pipeline with a novel two-stage search space, which aims at automatically generating high-performance GNN architectures.
Experiments on real-world datasets show that our generated GNN models outperform existing manually designed and NAS-based ones.
arXiv Detail & Related papers (2021-09-21T09:24:59Z) - Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can find GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
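Because this entry centers on updating edge features alongside node features, here is a minimal, hypothetical single message-passing step with edge features; the concrete update rules below are illustrative assumptions, not the operations EGNAS searches over.

```python
import numpy as np

rng = np.random.default_rng(2)

def edge_featured_step(X, E, edges, W_msg, W_edge):
    """One toy message-passing step updating node features X (n, d) and
    edge features E (m, d) for the edge list `edges` (m, 2)."""
    src, dst = edges[:, 0], edges[:, 1]
    msgs = np.maximum((X[src] + E) @ W_msg, 0.0)   # messages also carry edge features
    X_new = X.copy()
    np.add.at(X_new, dst, msgs)                    # sum-aggregate into target nodes
    # Edge update mixes both endpoints with the current edge feature.
    E_new = np.maximum(np.concatenate([X[src], X[dst], E], axis=1) @ W_edge, 0.0)
    return X_new, E_new

n, d = 5, 4
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 4], [4, 0]])  # directed 5-cycle
X = rng.normal(size=(n, d))
E = rng.normal(size=(len(edges), d))
W_msg = rng.normal(scale=0.1, size=(d, d))
W_edge = rng.normal(scale=0.1, size=(3 * d, d))
X, E = edge_featured_step(X, E, edges, W_msg, W_edge)
print(X.shape, E.shape)
```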
arXiv Detail & Related papers (2021-09-03T07:53:18Z) - Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose Search to Aggregate NEighborhood (SANE), a framework that automatically designs data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement-learning-based methods.
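A differentiable search of this kind typically relaxes the discrete choice of aggregator into a softmax-weighted mixture (as in DARTS); the sketch below shows only that relaxation over an assumed sum/mean/max candidate set and omits the gradient updates of the architecture weights, so read it as a generic illustration rather than SANE's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Candidate neighborhood aggregators (assumed for illustration).
def agg_sum(A, X):
    return A @ X

def agg_mean(A, X):
    return (A @ X) / np.maximum(A.sum(axis=1, keepdims=True), 1.0)

def agg_max(A, X):
    return np.stack([X[A[i] > 0].max(axis=0) if A[i].any() else X[i]
                     for i in range(len(A))])

CANDIDATES = [agg_sum, agg_mean, agg_max]

def mixed_aggregate(A, X, alpha):
    """Softmax-relaxed aggregator choice: continuous architecture weights
    `alpha` blend the candidates; after search, the argmax candidate is
    kept as the discrete choice."""
    w = np.exp(alpha - alpha.max())
    w = w / w.sum()
    return sum(wi * f(A, X) for wi, f in zip(w, CANDIDATES))

A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
alpha = np.zeros(len(CANDIDATES))   # learned by gradient descent in a real search
print(mixed_aggregate(A, X, alpha).shape)
print('discrete pick:', CANDIDATES[int(np.argmax(alpha))].__name__)
```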
arXiv Detail & Related papers (2021-04-14T03:15:19Z) - Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z) - Simplifying Architecture Search for Graph Neural Network [38.45540097927176]
We propose the SNAG framework, consisting of a novel search space and a reinforcement-learning-based search algorithm.
Experiments on real-world datasets demonstrate the effectiveness of the SNAG framework compared to human-designed GNNs and NAS methods.
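As a generic illustration of reinforcement-learning-based architecture search, the sketch below runs a REINFORCE-style loop: an independent softmax policy per architecture slot samples candidate designs, and a stub reward stands in for validation accuracy. None of this reproduces SNAG's actual controller or search space.

```python
import numpy as np

rng = np.random.default_rng(4)
SLOTS, CHOICES, LR = 2, 3, 0.5   # 2 design slots, 3 candidate ops each

def reward(arch):
    """Stub standing in for the validation accuracy of the sampled GNN."""
    return 1.0 if arch == (2, 0) else 0.1

logits = np.zeros((SLOTS, CHOICES))   # policy parameters, one softmax per slot

for _ in range(300):
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    arch = tuple(int(rng.choice(CHOICES, p=probs[s])) for s in range(SLOTS))
    R = reward(arch)
    # REINFORCE update: move probability mass toward high-reward choices.
    for s, a in enumerate(arch):
        grad = -probs[s]
        grad[a] += 1.0
        logits[s] += LR * R * grad

probs = np.exp(logits)
probs /= probs.sum(axis=1, keepdims=True)
print('most likely architecture:', tuple(int(p.argmax()) for p in probs))
```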
arXiv Detail & Related papers (2020-08-26T16:24:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.