Graph Neural Architecture Search with GPT-4
- URL: http://arxiv.org/abs/2310.01436v2
- Date: Thu, 14 Mar 2024 11:50:51 GMT
- Title: Graph Neural Architecture Search with GPT-4
- Authors: Haishuai Wang, Yang Gao, Xin Zheng, Peng Zhang, Hongyang Chen, Jiajun Bu, Philip S. Yu
- Abstract summary: We propose a new GPT-4 based Graph Neural Architecture Search method (GPT4GNAS for short).
The basic idea of our method is to design a new class of prompts for GPT-4 to guide GPT-4 toward the generative task of graph neural architectures.
By iteratively running GPT-4 with the prompts, GPT4GNAS generates more accurate graph neural networks with fast convergence.
- Score: 55.965641763959546
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Architecture Search (GNAS) has shown promising results in automatically designing graph neural networks. However, GNAS still requires intensive human labor with rich domain knowledge to design the search space and search strategy. In this paper, we integrate GPT-4 into GNAS and propose a new GPT-4 based Graph Neural Architecture Search method (GPT4GNAS for short). The basic idea of our method is to design a new class of prompts for GPT-4 to guide GPT-4 toward the generative task of graph neural architectures. The prompts consist of descriptions of the search space, search strategy, and search feedback of GNAS. By iteratively running GPT-4 with the prompts, GPT4GNAS generates more accurate graph neural networks with fast convergence. Experimental results show that embedding GPT-4 into GNAS outperforms the state-of-the-art GNAS methods.
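The abstract describes an iterative loop: a prompt built from the search space, search strategy, and accumulated feedback is sent to GPT-4, which proposes architectures that are then evaluated. A minimal runnable sketch of that loop is below; the GPT-4 call is replaced by a stub sampler, and all names (`SEARCH_SPACE`, `build_prompt`, `mock_llm`, `evaluate`, `search`) are illustrative assumptions, not the paper's actual code.

```python
# Sketch of a GPT4GNAS-style prompt/feedback loop. A stub "LLM" samples
# architectures from a toy search space so the loop runs without an API;
# the real method would send build_prompt() to GPT-4 instead.
import random

SEARCH_SPACE = {  # toy stand-in for a GNN architecture search space
    "conv": ["GCN", "GAT", "SAGE"],
    "layers": [2, 3, 4],
}

def build_prompt(history):
    """Compose a prompt from search space, strategy, and recent feedback."""
    return (
        f"Search space: {SEARCH_SPACE}\n"
        "Strategy: propose an unseen architecture likely to score well.\n"
        f"Feedback: {history[-3:]}"  # most recent (architecture, score) pairs
    )

def mock_llm(prompt, rng):
    """Stand-in for a GPT-4 call: returns one architecture from the space."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training and validating the proposed GNN."""
    return 0.7 + 0.05 * arch["layers"] + (0.1 if arch["conv"] == "GAT" else 0.0)

def search(iterations=10, seed=0):
    """Iterate prompt -> proposal -> evaluation, keeping the best result."""
    rng = random.Random(seed)
    history, best = [], None
    for _ in range(iterations):
        arch = mock_llm(build_prompt(history), rng)
        score = evaluate(arch)
        history.append((arch, score))
        if best is None or score > best[1]:
            best = (arch, score)
    return best

best_arch, best_score = search()
print(best_arch, round(best_score, 2))
```

The feedback slice in the prompt is what lets each iteration condition on earlier evaluations, which is the mechanism the abstract credits for fast convergence.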
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism, which identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance with half or fewer model parameters in the searched GNNs and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z)
- MapGPT: Map-Guided Prompting with Adaptive Path Planning for Vision-and-Language Navigation [73.81268591484198]
Embodied agents equipped with GPT have exhibited extraordinary decision-making and generalization abilities across various tasks.
We present a novel map-guided GPT-based agent, dubbed MapGPT, which introduces an online linguistic-formed map to encourage global exploration.
Benefiting from this design, we propose an adaptive planning mechanism to assist the agent in performing multi-step path planning based on a map.
arXiv Detail & Related papers (2024-01-14T15:34:48Z)
- Heterogeneous Graph Neural Architecture Search with GPT-4 [24.753296919178407]
Heterogeneous graph neural architecture search (HGNAS) represents a powerful tool for automatically designing effective heterogeneous graph neural networks.
Existing HGNAS algorithms suffer from inefficient searches and unstable results.
We present a new GPT-4 based HGNAS model to improve the search efficiency and search accuracy of HGNAS.
arXiv Detail & Related papers (2023-12-14T06:31:52Z)
- Efficient and Explainable Graph Neural Architecture Search via Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human effort and computational cost, graph neural architecture search (Graph NAS) has been used to search for a sub-optimal GNN architecture.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable.
arXiv Detail & Related papers (2023-08-30T03:21:45Z)
- GPT-NAS: Evolutionary Neural Architecture Search with the Generative Pre-Trained Model [25.187467297581073]
This work presents a novel architecture search algorithm, called GPT-NAS, that optimizes neural architectures with a Generative Pre-Trained (GPT) model.
In GPT-NAS, we assume that a generative model pre-trained on a large-scale corpus could learn the fundamental law of building neural architectures.
Our GPT-NAS method significantly outperforms seven manually designed neural architectures and thirteen architectures provided by competing NAS methods.
arXiv Detail & Related papers (2023-05-09T11:29:42Z)
- Can GPT-4 Perform Neural Architecture Search? [56.98363718371614]
We investigate the potential of GPT-4 to perform Neural Architecture Search (NAS).
Our proposed approach is GPT-4 Enhanced Neural Architecture Search (GENIUS).
We assess GENIUS across several benchmarks, comparing it with existing state-of-the-art NAS techniques to illustrate its effectiveness.
arXiv Detail & Related papers (2023-04-21T14:06:44Z)
- NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search [55.75621026447599]
We propose NAS-Bench-Graph, a tailored benchmark that supports unified, reproducible, and efficient evaluations for GraphNAS.
Specifically, we construct a unified, expressive yet compact search space, covering 26,206 unique graph neural network (GNN) architectures.
Based on our proposed benchmark, the performance of GNN architectures can be directly obtained by a look-up table without any further computation.
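The benchmark's key property is that every architecture in the space has been trained offline, so a search method reads precomputed metrics instead of retraining. A small sketch of that look-up idea follows; the table contents and keys here are invented for illustration, while the real benchmark tabulates results for 26,206 GNN architectures.

```python
# Sketch of the look-up-table evaluation behind NAS-Bench-Graph-style
# benchmarks: metrics are precomputed once, then queried with no training.
# Keys and numbers below are made up for illustration.
benchmark = {
    ("GCN", "GAT"): {"accuracy": 0.812, "latency_ms": 4.1},
    ("GAT", "GAT"): {"accuracy": 0.826, "latency_ms": 6.3},
    ("SAGE", "GCN"): {"accuracy": 0.804, "latency_ms": 3.7},
}

def lookup(arch):
    """Return stored metrics for an architecture key without any training."""
    return benchmark[arch]

# A "search" over the benchmark reduces to a comparison over stored scores.
best = max(benchmark, key=lambda arch: benchmark[arch]["accuracy"])
print(best, lookup(best))
```

Because evaluation becomes a dictionary access, different search strategies can be compared reproducibly at negligible compute cost.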
arXiv Detail & Related papers (2022-06-18T10:17:15Z)
- DFG-NAS: Deep and Flexible Graph Neural Architecture Search [27.337894841649494]
This paper proposes DFG-NAS, a new neural architecture search (NAS) method that enables the automatic search of very deep and flexible GNN architectures.
DFG-NAS highlights another level of design: the search for macro-architectures, i.e., how atomic propagation (P) and transformation (T) operations are integrated and organized into a GNN.
Empirical studies on four node classification tasks demonstrate that DFG-NAS outperforms state-of-the-art manual designs and NAS methods of GNNs.
arXiv Detail & Related papers (2022-06-17T06:47:21Z)
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design Graph Neural Architecture Paradigm (GAP) with tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.