Heterogeneous Graph Neural Architecture Search with GPT-4
- URL: http://arxiv.org/abs/2312.08680v1
- Date: Thu, 14 Dec 2023 06:31:52 GMT
- Title: Heterogeneous Graph Neural Architecture Search with GPT-4
- Authors: Haoyuan Dong, Yang Gao, Haishuai Wang, Hong Yang, Peng Zhang
- Abstract summary: Heterogeneous graph neural architecture search (HGNAS) represents a powerful tool for automatically designing effective heterogeneous graph neural networks.
Existing HGNAS algorithms suffer from inefficient searches and unstable results.
We present a new GPT-4 based HGNAS model to improve the search efficiency and search accuracy of HGNAS.
- Score: 24.753296919178407
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Heterogeneous graph neural architecture search (HGNAS) represents a powerful
tool for automatically designing effective heterogeneous graph neural networks.
However, existing HGNAS algorithms suffer from inefficient searches and
unstable results. In this paper, we present a new GPT-4 based HGNAS model
that improves the search efficiency and search accuracy of HGNAS: GPT-4
enhanced Heterogeneous Graph Neural Architecture Search (GHGNAS for short).
The basic idea of GHGNAS is to design a set of prompts that guide GPT-4
toward the task of generating new heterogeneous graph neural architectures.
By iteratively prompting GPT-4, GHGNAS continually
validates the accuracy of the generated HGNNs and uses the feedback to further
optimize the prompts. Experimental results show that GHGNAS can design new
HGNNs by leveraging the powerful generalization capability of GPT-4. Moreover,
GHGNAS runs more effectively and stably than previous HGNAS models based on
reinforcement learning and differentiable search algorithms.
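To make the search procedure concrete, the following is a minimal sketch of the iterate-validate-refine loop the abstract describes. The prompt wording, search space, and helper names (build_prompt, query_gpt4, train_and_evaluate) are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of a GHGNAS-style search loop: prompt GPT-4 for a
# candidate architecture, validate it, and feed the result back into the
# next prompt. All names and prompt text are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SEARCH_SPACE = {
    "aggregators": ["gcn", "gat", "sage", "han"],
    "activations": ["relu", "elu", "tanh"],
    "num_layers": [1, 2, 3],
}

def build_prompt(history):
    """Describe the search space and past results so GPT-4 can propose a
    better heterogeneous GNN architecture on the next iteration."""
    lines = [
        "You are designing heterogeneous graph neural networks.",
        f"Choose one option per field from: {SEARCH_SPACE}.",
        "Previously evaluated architectures and validation accuracies:",
    ]
    lines += [f"  {arch} -> {acc:.4f}" for arch, acc in history]
    lines.append("Propose one new architecture as JSON, avoiding repeats.")
    return "\n".join(lines)

def query_gpt4(prompt):
    resp = client.chat.completions.create(
        model="gpt-4", messages=[{"role": "user", "content": prompt}]
    )
    return resp.choices[0].message.content  # JSON description to parse

def train_and_evaluate(architecture):
    """Placeholder for the expensive step: build the HGNN described by
    `architecture`, train it on the target heterogeneous graph, and
    return its validation accuracy."""
    raise NotImplementedError

history = []
for step in range(20):  # fixed search budget
    candidate = query_gpt4(build_prompt(history))
    accuracy = train_and_evaluate(candidate)
    history.append((candidate, accuracy))  # feedback refines later prompts

best_arch, best_acc = max(history, key=lambda pair: pair[1])
```

The feedback signal here is simply the validation accuracy appended to the next prompt; the paper's actual prompt-optimization strategy may be richer.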
Related papers
- ABG-NAS: Adaptive Bayesian Genetic Neural Architecture Search for Graph Representation Learning [9.12960281953478]
ABG-NAS is a novel framework for automated graph neural network architecture search tailored for efficient graph representation learning.
ABG-NAS consistently outperforms both manually designed GNNs and state-of-the-art neural architecture search (NAS) methods.
arXiv Detail & Related papers (2025-04-30T01:44:27Z)
- LLM4GNAS: A Large Language Model Based Toolkit for Graph Neural Architecture Search [22.684865806396104]
Graph Neural Architecture Search (GNAS) facilitates the automatic design of Graph Neural Networks (GNNs).
Existing GNAS approaches often require manual adaptation to new graph search spaces.
We present LLM4GNAS, a toolkit for GNAS that leverages the generative capabilities of Large Language Models (LLMs).
arXiv Detail & Related papers (2025-02-12T07:26:07Z)
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism that identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance with half or fewer model parameters of searched GNNs and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z)
- Graph Neural Architecture Search with GPT-4 [55.965641763959546]
We propose a new GPT-4 based Graph Neural Architecture Search method (GPT4GNAS for short).
The basic idea of our method is to design a new class of prompts that guide GPT-4 toward the generative task of graph neural architectures.
By iteratively running GPT-4 with the prompts, GPT4GNAS generates more accurate graph neural networks with fast convergence.
arXiv Detail & Related papers (2023-09-30T08:05:59Z)
- Efficient and Explainable Graph Neural Architecture Search via Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human efforts and computational costs, graph neural architecture search (Graph NAS) has been used to search for a sub-optimal GNN architecture.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable.
arXiv Detail & Related papers (2023-08-30T03:21:45Z)
- GPT-NAS: Evolutionary Neural Architecture Search with the Generative Pre-Trained Model [25.187467297581073]
This work presents a novel architecture search algorithm, called GPT-NAS, that optimizes neural architectures with a Generative Pre-Trained (GPT) model.
In GPT-NAS, we assume that a generative model pre-trained on a large-scale corpus could learn the fundamental law of building neural architectures.
Our GPT-NAS method significantly outperforms seven manually designed neural architectures and thirteen architectures provided by competing NAS methods.
arXiv Detail & Related papers (2023-05-09T11:29:42Z)
- Can GPT-4 Perform Neural Architecture Search? [56.98363718371614]
We investigate the potential of GPT-4 to perform Neural Architecture Search (NAS).
Our proposed approach is GPT-4 Enhanced Neural archItectUre Search (GENIUS).
We assess GENIUS across several benchmarks, comparing it with existing state-of-the-art NAS techniques to illustrate its effectiveness.
arXiv Detail & Related papers (2023-04-21T14:06:44Z)
- Auto-HeG: Automated Graph Neural Network on Heterophilic Graphs [62.665761463233736]
We propose an automated graph neural network on heterophilic graphs, namely Auto-HeG, to automatically build heterophilic GNN models.
Specifically, Auto-HeG incorporates heterophily into all stages of automatic heterophilic graph learning, including search space design, supernet training, and architecture selection.
arXiv Detail & Related papers (2023-02-23T22:49:56Z)
- DFG-NAS: Deep and Flexible Graph Neural Architecture Search [27.337894841649494]
This paper proposes DFG-NAS, a new neural architecture search (NAS) method that enables the automatic search of very deep and flexible GNN architectures.
DFG-NAS highlights another level of design: the search for macro-architectures that determine how atomic propagation (P) and transformation (T) operations are integrated and organized into a GNN (a hedged P/T sketch appears after this list).
Empirical studies on four node classification tasks demonstrate that DFG-NAS outperforms state-of-the-art manual designs and NAS methods of GNNs.
arXiv Detail & Related papers (2022-06-17T06:47:21Z)
- Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate Heterogeneous Graph Neural Network [51.07168862821267]
We propose a unified framework covering most HGNNs, consisting of three components: heterogeneous linear transformation, heterogeneous graph transformation, and heterogeneous message passing layer.
We then build a platform Space4HGNN by defining a design space for HGNNs based on the unified framework, which offers modularized components, reproducible implementations, and standardized evaluation for HGNNs.
arXiv Detail & Related papers (2022-02-18T13:11:35Z)
- Are we really making much progress? Revisiting, benchmarking, and refining heterogeneous graph neural networks [38.15094159495419]
We present a systematic reproduction of 12 recent heterogeneous graph neural networks (HGNNs).
We find that simple homogeneous GNNs, e.g., GCN and GAT, are largely underestimated due to improper settings.
To facilitate robust and reproducible HGNN research, we construct the Heterogeneous Graph Benchmark (HGB).
arXiv Detail & Related papers (2021-12-30T06:29:21Z)
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
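As a rough illustration of the propagation/transformation (P/T) macro-architecture idea from the DFG-NAS entry above, the sketch below builds a GNN from an operation string such as "PTPT". The string encoding, dimensions, and class names are assumptions for exposition, not the paper's exact design.

```python
# Hedged sketch of a P/T macro-architecture in the spirit of DFG-NAS:
# a GNN is encoded as a string over atomic operations, where "P" is a
# parameter-free neighbor propagation and "T" is a learned transformation.
import torch
import torch.nn as nn

def propagate(x, adj_norm):
    """P: one step of normalized-adjacency propagation (no parameters)."""
    return adj_norm @ x

class PTNet(nn.Module):
    def __init__(self, plan, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.plan = plan  # e.g. "PTPT": alternate propagation/transformation
        dims, layers = in_dim, []
        for op in plan:
            if op == "T":
                layers.append(nn.Linear(dims, hidden_dim))
                dims = hidden_dim
        self.transforms = nn.ModuleList(layers)
        self.head = nn.Linear(dims, out_dim)

    def forward(self, x, adj_norm):
        t_idx = 0
        for op in self.plan:
            if op == "P":
                x = propagate(x, adj_norm)
            else:  # "T"
                x = torch.relu(self.transforms[t_idx](x))
                t_idx += 1
        return self.head(x)

# A NAS procedure would score many plans ("PPT", "PTPT", ...) and keep the
# best; here we only instantiate and run one candidate on dummy inputs.
x = torch.randn(10, 64)  # 10 nodes with 64-dim features
adj = torch.eye(10)      # stand-in for a normalized adjacency matrix
logits = PTNet("PTPT", in_dim=64, hidden_dim=128, out_dim=4)(x, adj)
```

Searching over such operation strings is what macro-architecture search means here: candidates differ in how P and T are ordered, not in the internals of each operation.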