LLM4GNAS: A Large Language Model Based Toolkit for Graph Neural Architecture Search
- URL: http://arxiv.org/abs/2502.10459v1
- Date: Wed, 12 Feb 2025 07:26:07 GMT
- Title: LLM4GNAS: A Large Language Model Based Toolkit for Graph Neural Architecture Search
- Authors: Yang Gao, Hong Yang, Yizhi Chen, Junxian Wu, Peng Zhang, Haishuai Wang
- Abstract summary: Graph Neural Architecture Search (GNAS) facilitates the automatic design of Graph Neural Networks (GNNs). Existing GNAS approaches often require manual adaptation to new graph search spaces. We present LLM4GNAS, a toolkit for GNAS that leverages the generative capabilities of Large Language Models (LLMs).
- Score: 22.684865806396104
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Architecture Search (GNAS) facilitates the automatic design of Graph Neural Networks (GNNs) tailored to specific downstream graph learning tasks. However, existing GNAS approaches often require manual adaptation to new graph search spaces, necessitating substantial code optimization and domain-specific knowledge. To address this challenge, we present LLM4GNAS, a toolkit for GNAS that leverages the generative capabilities of Large Language Models (LLMs). LLM4GNAS includes an algorithm library for graph neural architecture search algorithms based on LLMs, enabling the adaptation of GNAS methods to new search spaces through the modification of LLM prompts. This approach reduces the need for manual intervention in algorithm adaptation and code modification. The LLM4GNAS toolkit is extensible and robust, incorporating LLM-enhanced graph feature engineering, LLM-enhanced graph neural architecture search, and LLM-enhanced hyperparameter optimization. Experimental results indicate that LLM4GNAS outperforms existing GNAS methods on tasks involving both homogeneous and heterogeneous graphs.
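To make the prompt-driven approach concrete, below is a minimal Python sketch of one LLM-in-the-loop search round. The search space lives entirely in the prompt, so adapting to a new space only means editing that description. All names here (SEARCH_SPACE, query_llm, evaluate_architecture) are illustrative assumptions, not the actual LLM4GNAS API.

```python
# A minimal sketch of prompt-driven GNAS in the spirit of LLM4GNAS.
# Names (SEARCH_SPACE, query_llm, evaluate_architecture) are illustrative
# assumptions, not the toolkit's actual API.
import json
import random

# Adapting to a new search space only requires editing this description.
SEARCH_SPACE = {
    "conv": ["GCN", "GAT", "GraphSAGE", "GIN"],
    "aggregation": ["mean", "max", "sum"],
    "num_layers": [2, 3, 4],
}

def query_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; replace with an actual client.
    Here it samples a random candidate so the sketch runs end to end."""
    candidate = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
    return json.dumps(candidate)

def evaluate_architecture(arch: dict) -> float:
    """Stand-in for training the candidate GNN and returning val accuracy."""
    return random.random()

def search(n_rounds: int = 5) -> tuple[dict, float]:
    history: list[tuple[dict, float]] = []
    best: tuple[dict | None, float] = (None, -1.0)
    for _ in range(n_rounds):
        prompt = (
            "You are a GNAS agent. Search space:\n"
            f"{json.dumps(SEARCH_SPACE)}\n"
            f"Architectures evaluated so far: {history}\n"
            "Propose a promising architecture as JSON."
        )
        arch = json.loads(query_llm(prompt))
        score = evaluate_architecture(arch)
        history.append((arch, score))
        if score > best[1]:
            best = (arch, score)
    return best

if __name__ == "__main__":
    arch, score = search()
    print(f"best architecture: {arch} (val acc ~ {score:.3f})")
```

The same pattern would plausibly extend to the toolkit's LLM-enhanced feature engineering and hyperparameter optimization by swapping the prompt and the candidate schema.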
Related papers
- ABG-NAS: Adaptive Bayesian Genetic Neural Architecture Search for Graph Representation Learning [9.12960281953478]
ABG-NAS is a novel framework for automated graph neural network architecture search tailored for efficient graph representation learning.
ABG-NAS consistently outperforms both manually designed GNNs and state-of-the-art neural architecture search (NAS) methods.
arXiv Detail & Related papers (2025-04-30T01:44:27Z)
- Can Large Language Models Analyze Graphs like Professionals? A Benchmark, Datasets and Models [90.98855064914379]
We introduce ProGraph, a benchmark for large language models (LLMs) to process graphs.
Our findings reveal that the performance of current LLMs is unsatisfactory, with the best model achieving only 36% accuracy.
We propose the LLM4Graph datasets, which include crawled documents and auto-generated code based on six widely used graph libraries.
arXiv Detail & Related papers (2024-09-29T11:38:45Z)
- All Against Some: Efficient Integration of Large Language Models for Message Passing in Graph Neural Networks [51.19110891434727]
Large Language Models (LLMs), with their pretrained knowledge and powerful semantic comprehension, have recently shown remarkable benefits for applications using vision and text data.
E-LLaGNN is a framework with an on-demand LLM service that enriches the message-passing procedure of graph learning by enhancing a limited fraction of the graph's nodes, as sketched below.
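A hedged sketch of that idea, enhancing only a small, budgeted fraction of nodes with an LLM before message passing; the degree-based selection heuristic and the enrich_with_llm stub are illustrative assumptions, not E-LLaGNN's actual implementation:

```python
# Illustrative sketch: enrich a budgeted fraction of nodes with an LLM,
# then run one mean-aggregation message-passing step.
import numpy as np

def enrich_with_llm(text: str) -> np.ndarray:
    """Stand-in for querying an LLM for an enriched node description and
    embedding it; here it just returns a deterministic random vector."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(8)

def selective_enhance(adj: np.ndarray, feats: np.ndarray,
                      texts: list[str], budget: float = 0.2) -> np.ndarray:
    """Enhance only the top-`budget` fraction of nodes (by degree here)."""
    degrees = adj.sum(axis=1)
    k = max(1, int(budget * len(texts)))
    chosen = np.argsort(-degrees)[:k]           # high-degree nodes first
    feats = feats.copy()
    for i in chosen:
        feats[i] = feats[i] + enrich_with_llm(texts[i])
    deg = np.maximum(degrees, 1.0)[:, None]
    return (adj @ feats) / deg                  # mean neighbor aggregation

if __name__ == "__main__":
    n = 5
    adj = (np.random.rand(n, n) < 0.4).astype(float)
    np.fill_diagonal(adj, 0)
    feats = np.random.randn(n, 8)
    texts = [f"node {i} description" for i in range(n)]
    print(selective_enhance(adj, feats, texts).shape)  # (5, 8)
```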
arXiv Detail & Related papers (2024-07-20T22:09:42Z)
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism that identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance while the searched GNNs use half or fewer model parameters and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z)
- Exploring the Potential of Large Language Models in Graph Generation [51.046188600990014]
Graph generation asks large language models (LLMs) to produce graphs with given properties.
This paper explores the abilities of LLMs for graph generation with systematic task designs and experiments.
Our evaluations demonstrate that LLMs, particularly GPT-4, exhibit preliminary abilities in graph generation tasks.
arXiv Detail & Related papers (2024-03-21T12:37:54Z)
- LLaGA: Large Language and Graph Assistant [73.71990472543027]
Large Language and Graph Assistant (LLaGA) is an innovative model to handle the complexities of graph-structured data.
LLaGA excels in versatility, generalizability and interpretability, allowing it to perform consistently well across different datasets and tasks.
Our experiments show that LLaGA delivers outstanding performance across four datasets and three tasks using a single model.
arXiv Detail & Related papers (2024-02-13T02:03:26Z)
- Large Language Models as Topological Structure Enhancers for Text-Attributed Graphs [4.487720716313697]
Large language models (LLMs) have revolutionized the field of natural language processing (NLP).
This work explores how to leverage the information retrieval and text generation capabilities of LLMs to refine and enhance the topological structure of text-attributed graphs (TAGs) in the node classification setting, as sketched below.
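As a rough illustration, the sketch below scores candidate node pairs with a (stubbed) LLM relatedness query and adds the top-scoring missing edges to the graph; llm_relatedness and the greedy edge-addition rule are assumptions for illustration, not the paper's method:

```python
# Illustrative sketch: LLM-guided topology refinement for a TAG.
import itertools
import numpy as np

def llm_relatedness(text_a: str, text_b: str) -> float:
    """Stand-in for prompting an LLM: 'are these two nodes related?'
    Here: word-overlap count, so the sketch runs without an LLM."""
    return float(len(set(text_a.split()) & set(text_b.split())))

def refine_topology(adj: np.ndarray, texts: list[str],
                    n_new: int = 2) -> np.ndarray:
    """Add the n_new highest-scoring missing edges."""
    scored = [
        (llm_relatedness(texts[i], texts[j]), i, j)
        for i, j in itertools.combinations(range(len(texts)), 2)
        if adj[i, j] == 0
    ]
    adj = adj.copy()
    for _, i, j in sorted(scored, reverse=True)[:n_new]:
        adj[i, j] = adj[j, i] = 1.0
    return adj

if __name__ == "__main__":
    texts = ["graph neural network", "neural network search",
             "language model", "large language model"]
    print(refine_topology(np.zeros((4, 4)), texts))
```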
arXiv Detail & Related papers (2023-11-24T07:53:48Z)
- DFG-NAS: Deep and Flexible Graph Neural Architecture Search [27.337894841649494]
This paper proposes DFG-NAS, a new neural architecture search (NAS) method that enables the automatic search of very deep and flexible GNN architectures.
DFG-NAS highlights another level of design: the search for macro-architectures that govern how atomic propagation (P) and transformation (T) operations are integrated and organized into a GNN, as sketched below.
Empirical studies on four node classification tasks demonstrate that DFG-NAS outperforms state-of-the-art manual designs and NAS methods of GNNs.
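A minimal sketch of the P/T macro-architecture idea: a candidate GNN is just a string over propagation (P) and transformation (T) operations, executed left to right. The concrete operator definitions below are common simplifications assumed for illustration, not DFG-NAS's exact operators:

```python
# Illustrative sketch: execute a P/T macro-architecture string as a GNN.
import numpy as np

def run_macro_architecture(arch: str, adj: np.ndarray, x: np.ndarray,
                           rng: np.random.Generator) -> np.ndarray:
    """Execute e.g. arch = "PTPPT" on row-normalized adjacency `adj`."""
    h = x
    for op in arch:
        if op == "P":        # propagation: mix neighbor features
            h = adj @ h
        elif op == "T":      # transformation: random linear layer + ReLU
            w = rng.standard_normal((h.shape[1], h.shape[1])) * 0.1
            h = np.maximum(h @ w, 0.0)
        else:
            raise ValueError(f"unknown op {op!r}")
    return h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = (rng.random((6, 6)) < 0.5).astype(float)
    a = a / np.maximum(a.sum(axis=1, keepdims=True), 1.0)  # row-normalize
    x = rng.standard_normal((6, 4))
    print(run_macro_architecture("PTPPT", a, x, rng).shape)  # (6, 4)
```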
arXiv Detail & Related papers (2022-06-17T06:47:21Z)
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations (see the sketch after this entry).
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
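To illustrate the tree-topology idea from the entry above: intermediate representations form a tree whose branches apply fine-grained atomic operations and are merged at the root. The two operations below (a feature filter and a mean neighbor aggregation) and the sum-merge are assumptions for illustration, not GAP's exact definitions:

```python
# Illustrative sketch: a tree-topology GNN built from two atomic operations.
import numpy as np

def feature_filter(h: np.ndarray, keep: list[int]) -> np.ndarray:
    """Atomic op 1: zero out all but a subset of feature dimensions."""
    mask = np.zeros(h.shape[1])
    mask[keep] = 1.0
    return h * mask

def neighbor_aggregate(adj: np.ndarray, h: np.ndarray) -> np.ndarray:
    """Atomic op 2: mean aggregation over neighbors."""
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    return (adj @ h) / deg

def tree_gnn(adj: np.ndarray, x: np.ndarray) -> np.ndarray:
    # Two branches grow from the same input (the tree's leaves)...
    left = neighbor_aggregate(adj, feature_filter(x, keep=[0, 1]))
    right = neighbor_aggregate(adj, neighbor_aggregate(adj, x))  # depth 2
    return left + right      # ...and are merged at the tree root.

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    adj = (rng.random((5, 5)) < 0.5).astype(float)
    x = rng.standard_normal((5, 4))
    print(tree_gnn(adj, x).shape)  # (5, 4)
```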