LLM and GNN are Complementary: Distilling LLM for Multimodal Graph Learning
- URL: http://arxiv.org/abs/2406.01032v1
- Date: Mon, 3 Jun 2024 06:33:51 GMT
- Title: LLM and GNN are Complementary: Distilling LLM for Multimodal Graph Learning
- Authors: Junjie Xu, Zongyu Wu, Minhua Lin, Xiang Zhang, Suhang Wang
- Abstract summary: We present an innovative framework that utilizes multimodal molecular data to extract insights from Large Language Models (LLMs). We introduce GALLON, a framework that synergizes the capabilities of LLMs and Graph Neural Networks (GNNs) by distilling multimodal knowledge into a unified Multilayer Perceptron (MLP).
- Score: 26.980622926162933
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recent progress in Graph Neural Networks (GNNs) has greatly enhanced the ability to model complex molecular structures for predicting properties. Nevertheless, molecular data encompasses more than just graph structures, including textual and visual information that GNNs do not handle well. To bridge this gap, we present an innovative framework that utilizes multimodal molecular data to extract insights from Large Language Models (LLMs). We introduce GALLON (Graph Learning from Large Language Model Distillation), a framework that synergizes the capabilities of LLMs and GNNs by distilling multimodal knowledge into a unified Multilayer Perceptron (MLP). This method integrates the rich textual and visual data of molecules with the structural analysis power of GNNs. Extensive experiments reveal that our distilled MLP model notably improves the accuracy and efficiency of molecular property predictions.
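The core mechanism the abstract describes, distilling knowledge from larger teacher models into a lightweight MLP student, can be sketched as standard soft-label distillation. The snippet below is a minimal illustration only: the tensor shapes, the random stand-ins for molecular features and teacher predictions, and the temperature value are all assumptions, not details from the GALLON paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Illustrative dimensions (not from the paper): 32 molecules,
# 16-dim input features, binary property prediction.
num_mols, feat_dim, num_classes = 32, 16, 2

# Stand-ins for molecular features and for the teacher's logits,
# which in GALLON would come from the LLM/GNN side.
x = torch.randn(num_mols, feat_dim)
teacher_logits = torch.randn(num_mols, num_classes)

# The student: a small MLP, cheap to run at inference time.
student = nn.Sequential(
    nn.Linear(feat_dim, 64),
    nn.ReLU(),
    nn.Linear(64, num_classes),
)
opt = torch.optim.Adam(student.parameters(), lr=1e-2)

T = 2.0  # softmax temperature; softens teacher targets
for _ in range(200):
    opt.zero_grad()
    # KL divergence between student and teacher soft distributions,
    # scaled by T^2 as is conventional in distillation.
    s = F.log_softmax(student(x) / T, dim=-1)
    t = F.softmax(teacher_logits / T, dim=-1)
    loss = F.kl_div(s, t, reduction="batchmean") * T * T
    loss.backward()
    opt.step()
```

After training, only the MLP is needed for prediction, which is the efficiency gain the abstract claims for the distilled model.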