Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs
- URL: http://arxiv.org/abs/2307.03393v4
- Date: Tue, 16 Jan 2024 16:12:14 GMT
- Title: Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs
- Authors: Zhikai Chen, Haitao Mao, Hang Li, Wei Jin, Hongzhi Wen, Xiaochi Wei,
Shuaiqiang Wang, Dawei Yin, Wenqi Fan, Hui Liu, Jiliang Tang
- Abstract summary: Large Language Models (LLMs) have been proven to possess extensive common knowledge and powerful semantic comprehension abilities.
We investigate two possible pipelines: LLMs-as-Enhancers and LLMs-as-Predictors.
- Score: 59.74814230246034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning on Graphs has attracted immense attention due to its wide real-world
applications. The most popular pipeline for learning on graphs with textual
node attributes primarily relies on Graph Neural Networks (GNNs) and utilizes
shallow text embeddings as initial node representations, which limits general
knowledge and deep semantic understanding. In recent years,
Large Language Models (LLMs) have been proven to possess extensive common
knowledge and powerful semantic comprehension abilities that have
revolutionized existing workflows to handle text data. In this paper, we aim to
explore the potential of LLMs in graph machine learning, especially the node
classification task, and investigate two possible pipelines: LLMs-as-Enhancers
and LLMs-as-Predictors. The former leverages LLMs to enhance nodes' text
attributes with their massive knowledge and then generate predictions through
GNNs. The latter attempts to directly employ LLMs as standalone predictors. We
conduct comprehensive and systematic studies of these two pipelines under
various settings. From the empirical results, we make original observations
and gain new insights that open up new possibilities and suggest promising
directions for leveraging LLMs in learning on graphs. Our code and
datasets are available at https://github.com/CurryTang/Graph-LLM.
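The two pipelines described above lend themselves to a minimal sketch (this is not the authors' released code; the random-feature llm_encode stand-in, the toy graph, and the prompt wording are all illustrative assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def llm_encode(texts, dim=64):
    # Stand-in for an LLM-based text encoder (e.g. a sentence-embedding
    # model); random features keep the sketch self-contained.
    torch.manual_seed(0)
    return torch.randn(len(texts), dim)

class GCNLayer(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.lin = nn.Linear(d_in, d_out)

    def forward(self, x, a_hat):
        # a_hat is the symmetrically normalized adjacency with self-loops.
        return self.lin(a_hat @ x)

class GCN(nn.Module):
    def __init__(self, d_in, d_hid, n_cls):
        super().__init__()
        self.g1 = GCNLayer(d_in, d_hid)
        self.g2 = GCNLayer(d_hid, n_cls)

    def forward(self, x, a_hat):
        return self.g2(F.relu(self.g1(x, a_hat)), a_hat)

def normalize_adj(adj):
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).rsqrt().diag()
    return d_inv_sqrt @ a @ d_inv_sqrt

# Pipeline 1, LLMs-as-Enhancers: LLM-derived node features -> GNN predictions.
texts = ["GNN paper", "LLM paper", "GNN survey", "LLM survey"]
adj = torch.tensor([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=torch.float)
x = llm_encode(texts)
logits = GCN(64, 32, 2)(x, normalize_adj(adj))

# Pipeline 2, LLMs-as-Predictors: flatten the ego-network into a prompt and
# ask the LLM for the label directly (the prompt format is illustrative).
def build_prompt(node_text, neighbor_texts, labels):
    return (f"Node text: {node_text}\n"
            f"Neighbor texts: {'; '.join(neighbor_texts)}\n"
            f"Classify the node as one of {labels}. Answer with one label.")

print(build_prompt(texts[0], [texts[1], texts[2]], ["GNN", "LLM"]))
```

In practice, the enhancer features would come from real LLM embeddings or LLM-augmented text rather than the random stand-in, and the predictor prompt would be sent to an actual LLM.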
Related papers
- How Do Large Language Models Understand Graph Patterns? A Benchmark for Graph Pattern Comprehension [53.6373473053431]
This work introduces a benchmark to assess large language models' capabilities on graph pattern tasks.
The benchmark evaluates whether LLMs can understand graph patterns from either terminological or topological descriptions.
It covers both synthetic and real datasets, spanning 11 tasks and 7 models.
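The two description styles can be illustrated with a hypothetical pair of prompts for the same triangle query (the wording is an assumption, not the benchmark's actual templates):

```python
# Hypothetical prompt styles for the same query, a triangle pattern.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]

# Terminological: name the pattern, relying on the LLM's knowledge of the term.
term_prompt = (
    f"Graph edges: {edges}. Does this graph contain a triangle "
    f"(a cycle of exactly three nodes)? Answer yes or no."
)

# Topological: spell out the structure without naming it.
topo_prompt = (
    f"Graph edges: {edges}. Are there three nodes u, v, w such that "
    f"(u, v), (v, w), and (w, u) are all edges? Answer yes or no."
)
```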
arXiv Detail & Related papers (2024-10-04T04:48:33Z)
- All Against Some: Efficient Integration of Large Language Models for Message Passing in Graph Neural Networks [51.19110891434727]
Large Language Models (LLMs), with their pretrained knowledge and powerful semantic comprehension abilities, have recently shown remarkable effectiveness in applications using vision and text data.
E-LLaGNN is a framework with an on-demand LLM service that enriches the message-passing procedure of graph learning by enhancing a limited fraction of nodes in the graph, as sketched below.
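A minimal sketch of the budgeted selection idea (the degree heuristic and function names are assumptions, not the paper's actual policy):

```python
import torch

def select_nodes_for_llm(adj, budget):
    # Pick the highest-degree nodes for (expensive) LLM enhancement;
    # the real framework may use other heuristics (e.g. uncertainty).
    degree = adj.sum(dim=1)
    return torch.topk(degree, k=budget).indices

def enhance_features(x_shallow, x_llm, adj, budget):
    # Only the selected fraction of nodes receives LLM-derived features;
    # the rest keep their cheap shallow embeddings.
    chosen = select_nodes_for_llm(adj, budget)
    x = x_shallow.clone()
    x[chosen] = x_llm[chosen]
    return x

adj = torch.tensor([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=torch.float)
x = enhance_features(torch.zeros(3, 4), torch.ones(3, 4), adj, budget=1)
```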
arXiv Detail & Related papers (2024-07-20T22:09:42Z)
- Parameter-Efficient Tuning Large Language Models for Graph Representation Learning [62.26278815157628]
We introduce Graph-aware Parameter-Efficient Fine-Tuning (GPEFT), a novel approach for efficient graph representation learning.
We use a graph neural network (GNN) to encode structural information from neighboring nodes into a graph prompt.
We validate our approach through comprehensive experiments conducted on 8 different text-rich graphs, observing an average improvement of 2% in hit@1 and Mean Reciprocal Rank (MRR) in link prediction evaluations.
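A rough sketch of the graph-prompt idea, assuming one message-passing step for neighborhood aggregation and a single soft token prepended to the frozen LLM's input embeddings (both simplifications of GPEFT based on the summary above):

```python
import torch
import torch.nn as nn

class GraphPromptEncoder(nn.Module):
    # Pool neighbor features with one message-passing step and project
    # into the LLM's embedding space as a single soft prompt token.
    def __init__(self, d_node, d_llm):
        super().__init__()
        self.proj = nn.Linear(d_node, d_llm)

    def forward(self, x, a_hat, node_id):
        h = a_hat @ x                 # aggregate neighborhood information
        return self.proj(h[node_id])  # one prompt vector for this node

# Prepend the graph prompt to the token embeddings of the node's text;
# only the encoder (not the frozen LLM) would be trained.
d_node, d_llm, seq_len = 16, 32, 5
enc = GraphPromptEncoder(d_node, d_llm)
x = torch.randn(4, d_node)
a_hat = torch.full((4, 4), 0.25)              # toy normalized adjacency
token_embeds = torch.randn(seq_len, d_llm)    # stand-in for LLM embeddings
prompt = enc(x, a_hat, node_id=0)
llm_input = torch.cat([prompt.unsqueeze(0), token_embeds], dim=0)
```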
arXiv Detail & Related papers (2024-04-28T18:36:59Z)
- Large Language Models on Graphs: A Comprehensive Survey [77.16803297418201]
We provide a systematic review of scenarios and techniques related to large language models on graphs.
We first summarize potential scenarios of adopting LLMs on graphs into three categories, namely pure graphs, text-attributed graphs, and text-paired graphs.
We discuss the real-world applications of such methods and summarize open-source code and benchmark datasets.
arXiv Detail & Related papers (2023-12-05T14:14:27Z)
- Large Language Models as Topological Structure Enhancers for Text-Attributed Graphs [4.487720716313697]
Large language models (LLMs) have revolutionized the field of natural language processing (NLP).
This work explores how to leverage the information retrieval and text generation capabilities of LLMs to refine/enhance the topological structure of text-attributed graphs (TAGs) under the node classification setting.
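One plausible reading of such structure refinement, sketched below: propose candidate edges between nodes whose LLM-derived text embeddings are similar (the threshold and embedding source are assumptions, not the paper's method):

```python
import torch
import torch.nn.functional as F

def propose_edges(text_emb, threshold=0.8):
    # Add candidate edges between nodes whose (LLM-derived) text embeddings
    # are highly similar; pruning low-similarity edges would be symmetric.
    sim = F.cosine_similarity(text_emb.unsqueeze(1), text_emb.unsqueeze(0), dim=-1)
    cand = (sim > threshold).nonzero()
    return [(int(u), int(v)) for u, v in cand if u < v]

emb = torch.randn(5, 8)  # stand-in for LLM text embeddings of 5 nodes
print(propose_edges(emb, threshold=0.5))
```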
arXiv Detail & Related papers (2023-11-24T07:53:48Z)
- Beyond Text: A Deep Dive into Large Language Models' Ability on Understanding Graph Data [13.524529952170672]
Large language models (LLMs) have achieved impressive performance on many natural language processing tasks.
We aim to assess whether LLMs can effectively process graph data and leverage topological structures to enhance performance.
By comparing LLMs' performance with specialized graph models, we offer insights into the strengths and limitations of employing LLMs for graph analytics.
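A minimal sketch of such an assessment harness, assuming a graph library supplies the ground truth and using deliberately naive answer parsing (both illustrative):

```python
import networkx as nx

def ground_truth_shortest_path(edges, src, dst):
    # A specialized graph tool computes the exact answer.
    g = nx.Graph(edges)
    return nx.shortest_path_length(g, src, dst)

def score_llm_answer(llm_answer: str, truth: int) -> bool:
    # Real evaluations need more robust answer extraction than this.
    return llm_answer.strip() == str(truth)

edges = [(0, 1), (1, 2), (2, 3)]
truth = ground_truth_shortest_path(edges, 0, 3)   # -> 3
print(score_llm_answer("3", truth))               # compare LLM output to truth
```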
arXiv Detail & Related papers (2023-10-07T23:25:22Z)
- Harnessing Explanations: LLM-to-LM Interpreter for Enhanced Text-Attributed Graph Representation Learning [51.90524745663737]
A key innovation is the use of explanations as features, which boost GNN performance on downstream tasks.
Our method achieves state-of-the-art results on well-established TAG datasets.
Our method significantly speeds up training, achieving a 2.88 times improvement over the closest baseline on ogbn-arxiv.
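A sketch of the explanations-as-features idea: an LLM produces an explanation, a smaller LM encodes it, and the result is concatenated with the original node features (both function bodies below are placeholders, not the paper's models):

```python
import torch

def llm_explain(node_text: str) -> str:
    # Stand-in for prompting an LLM for a prediction plus explanation.
    return f"This text discusses {node_text.split()[0]}, suggesting topic X."

def lm_encode(text: str, dim: int = 16) -> torch.Tensor:
    # Stand-in for a smaller LM that converts the explanation into a
    # dense feature vector usable by the GNN.
    torch.manual_seed(abs(hash(text)) % (2**31))
    return torch.randn(dim)

node_text = "graph neural networks for citation analysis"
x_orig = torch.randn(16)                       # original node features
x_expl = lm_encode(llm_explain(node_text))     # explanation-derived features
x_node = torch.cat([x_orig, x_expl])           # enriched GNN input
```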
arXiv Detail & Related papers (2023-05-31T03:18:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.