Knowledge Graph-Guided Retrieval Augmented Generation
- URL: http://arxiv.org/abs/2502.06864v1
- Date: Sat, 08 Feb 2025 02:14:31 GMT
- Title: Knowledge Graph-Guided Retrieval Augmented Generation
- Authors: Xiangrong Zhu, Yuexiang Xie, Yi Liu, Yaliang Li, Wei Hu
- Abstract summary: We propose a Knowledge Graph-Guided Retrieval Augmented Generation framework.
KG$^2$RAG provides fact-level relationships between chunks, improving the diversity and coherence of the retrieved results.
- Score: 34.83235788116369
- Abstract: Retrieval-augmented generation (RAG) has emerged as a promising technology for addressing hallucination issues in the responses generated by large language models (LLMs). Existing studies on RAG primarily focus on applying semantic-based approaches to retrieve isolated relevant chunks, which ignore their intrinsic relationships. In this paper, we propose a novel Knowledge Graph-Guided Retrieval Augmented Generation (KG$^2$RAG) framework that utilizes knowledge graphs (KGs) to provide fact-level relationships between chunks, improving the diversity and coherence of the retrieved results. Specifically, after performing a semantic-based retrieval to provide seed chunks, KG$^2$RAG employs a KG-guided chunk expansion process and a KG-based chunk organization process to deliver relevant and important knowledge in well-organized paragraphs. Extensive experiments conducted on the HotpotQA dataset and its variants demonstrate the advantages of KG$^2$RAG compared to existing RAG-based approaches, in terms of both response quality and retrieval quality.
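The abstract describes a three-stage flow: semantic retrieval of seed chunks, KG-guided expansion along fact-level links, and KG-based organization of the result. The sketch below is a minimal illustration of that flow under assumed data structures (a `Chunk` record carrying an embedding and linked entities, and a `kg_edges` adjacency map); it is not the authors' implementation.
```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

import numpy as np


@dataclass(eq=False)
class Chunk:
    text: str
    embedding: np.ndarray                              # vector from any sentence encoder
    entities: Set[str] = field(default_factory=set)    # entities linked to this chunk


def _cos(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def kg2rag_retrieve(query_emb: np.ndarray,
                    chunks: List[Chunk],
                    kg_edges: Dict[str, Set[str]],
                    k: int = 3) -> List[Chunk]:
    """Illustrative seed retrieval -> KG-guided expansion -> KG-based organization."""
    # 1) Semantic retrieval of seed chunks.
    seeds = sorted(chunks, key=lambda c: _cos(query_emb, c.embedding), reverse=True)[:k]

    # 2) KG-guided expansion: the seeds' entities plus their 1-hop KG neighbours
    #    form a frontier; chunks touching the frontier are pulled in.
    seed_entities: Set[str] = set().union(*(c.entities for c in seeds)) if seeds else set()
    frontier = seed_entities | {n for e in seed_entities for n in kg_edges.get(e, set())}
    expanded = [c for c in chunks if c not in seeds and c.entities & frontier]

    # 3) KG-based organization: each seed is followed by the expansion chunks
    #    that are KG-connected to it, so related facts end up adjacent.
    organized: List[Chunk] = []
    for seed in seeds:
        organized.append(seed)
        related = seed.entities | {n for e in seed.entities for n in kg_edges.get(e, set())}
        organized.extend(c for c in expanded if c.entities & related and c not in organized)
    return organized
```
A full pipeline would further filter the organized chunks and serialize them into coherent paragraphs before placing them in the LLM prompt; the sketch stops at ordering.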
Related papers
- ArchRAG: Attributed Community-based Hierarchical Retrieval-Augmented Generation [16.204046295248546]
Retrieval-Augmented Generation (RAG) has proven effective in integrating external knowledge into large language models.
We introduce a novel graph-based RAG approach, called Attributed Community-based Hierarchical RAG (ArchRAG).
We build a novel hierarchical index structure for the attributed communities and develop an effective online retrieval method.
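The summary leaves the index unspecified; one plausible reading, sketched below, is a tree of communities whose attribute summaries are embedded and searched top-down with a small beam. The `Community` structure and the beam-search rule are assumptions for illustration, not ArchRAG's actual index.
```python
from dataclasses import dataclass, field
from typing import List

import numpy as np


@dataclass
class Community:
    summary: str                                        # attribute text describing the community
    embedding: np.ndarray                               # embedding of that summary
    members: List[str] = field(default_factory=list)    # entity / chunk ids
    children: List["Community"] = field(default_factory=list)


def search_hierarchy(root: Community, query_emb: np.ndarray, beam: int = 2) -> List[Community]:
    """Top-down beam search: at each level keep the `beam` children whose
    summaries best match the query, and return the leaf communities reached."""
    def sim(c: Community) -> float:
        return float(query_emb @ c.embedding /
                     (np.linalg.norm(query_emb) * np.linalg.norm(c.embedding) + 1e-9))

    frontier, leaves = [root], []
    while frontier:
        nxt = []
        for node in frontier:
            if not node.children:
                leaves.append(node)
            else:
                nxt.extend(sorted(node.children, key=sim, reverse=True)[:beam])
        frontier = nxt
    return sorted(leaves, key=sim, reverse=True)
```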
arXiv Detail & Related papers (2025-02-14T03:28:36Z)
- GFM-RAG: Graph Foundation Model for Retrieval Augmented Generation [84.41557981816077]
We introduce GFM-RAG, a novel graph foundation model (GFM) for retrieval augmented generation.
GFM-RAG is powered by an innovative graph neural network that reasons over graph structure to capture complex query-knowledge relationships.
It achieves state-of-the-art performance while maintaining efficiency and alignment with neural scaling laws.
arXiv Detail & Related papers (2025-02-03T07:04:29Z)
- Chain-of-Retrieval Augmented Generation [72.06205327186069]
This paper introduces an approach for training o1-like RAG models that retrieve and reason over relevant information step by step before generating the final answer.
Our proposed method, CoRAG, allows the model to dynamically reformulate the query based on the evolving state.
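That step-by-step behaviour can be pictured as the loop below, where `retrieve` and `llm` are caller-supplied stand-ins for the trained retriever and model, and the stopping rule (a fixed step budget plus an `ANSWER:` sentinel) is an assumption rather than CoRAG's training setup.
```python
from typing import Callable, List


def corag_style_answer(question: str,
                       retrieve: Callable[[str], List[str]],
                       llm: Callable[[str], str],
                       max_steps: int = 4) -> str:
    """Iteratively reformulate the query, retrieve, and reason before answering."""
    state: List[str] = []                              # accumulated (sub-query, evidence) notes
    query = question
    for _ in range(max_steps):
        evidence = retrieve(query)                     # retrieve for the current sub-query
        state.append(f"Sub-query: {query}\nEvidence: {' '.join(evidence)}")
        prompt = (f"Question: {question}\n"
                  + "\n".join(state)
                  + "\nIf you can answer, reply 'ANSWER: <answer>'. "
                    "Otherwise reply with the next sub-query to retrieve.")
        reply = llm(prompt)
        if reply.startswith("ANSWER:"):
            return reply[len("ANSWER:"):].strip()
        query = reply                                  # dynamically reformulated query
    # Fall back to a forced answer once the step budget is spent.
    return llm(f"Question: {question}\n" + "\n".join(state) + "\nANSWER:")
```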
arXiv Detail & Related papers (2025-01-24T09:12:52Z)
- Ontology-grounded Automatic Knowledge Graph Construction by LLM under Wikidata schema [60.42231674887294]
We propose an ontology-grounded approach to Knowledge Graph (KG) construction using Large Language Models (LLMs) on a knowledge base.
We ground the KG generation in the authored ontology, based on the extracted relations, to ensure consistency and interpretability.
Our work presents a promising direction for a scalable KG construction pipeline with minimal human intervention that yields high-quality and human-interpretable KGs.
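One way to picture the ontology grounding is a post-hoc check that keeps only extracted triples whose relation signature (allowed subject and object types) is licensed by the ontology; the toy schema below is invented for illustration and is not the paper's or Wikidata's representation.
```python
from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]          # (subject, relation, object)

# Hypothetical ontology: relation -> (allowed subject type, allowed object type)
ONTOLOGY: Dict[str, Tuple[str, str]] = {
    "educated_at": ("Person", "Organization"),
    "capital_of": ("City", "Country"),
}


def ground_triples(triples: List[Triple],
                   entity_types: Dict[str, str]) -> List[Triple]:
    """Keep only triples that conform to the ontology's relation signatures."""
    grounded = []
    for subj, rel, obj in triples:
        sig = ONTOLOGY.get(rel)
        if sig is None:
            continue                                   # relation not in the ontology
        subj_t, obj_t = sig
        if entity_types.get(subj) == subj_t and entity_types.get(obj) == obj_t:
            grounded.append((subj, rel, obj))
    return grounded


# Example: the second triple is dropped because "lives_in" is not in the ontology.
print(ground_triples(
    [("Ada Lovelace", "educated_at", "University of London"),
     ("Ada Lovelace", "lives_in", "London")],
    {"Ada Lovelace": "Person", "University of London": "Organization", "London": "City"},
))
```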
arXiv Detail & Related papers (2024-12-30T13:36:05Z)
- SimGRAG: Leveraging Similar Subgraphs for Knowledge Graphs Driven Retrieval-Augmented Generation [6.568733377722896]
We propose a novel Similar Graph Enhanced Retrieval-Augmented Generation (SimGRAG) method.
It effectively addresses the challenge of aligning query texts and knowledge graphs.
SimGRAG outperforms state-of-the-art KG-driven RAG methods in question answering and fact verification.
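As a rough illustration of aligning query text with subgraphs, the toy scorer below verbalizes each triple of a candidate subgraph and averages its embedding similarity to the query; SimGRAG's actual alignment is more involved, so treat this only as the general idea, with `embed` standing in for any text encoder.
```python
from typing import Callable, List, Tuple

import numpy as np

Triple = Tuple[str, str, str]


def score_subgraph(query: str,
                   subgraph: List[Triple],
                   embed: Callable[[str], np.ndarray]) -> float:
    """Toy query-subgraph alignment: mean cosine similarity between the query
    and each verbalized triple of the candidate subgraph."""
    q = embed(query)
    sims = []
    for s, r, o in subgraph:
        t = embed(f"{s} {r} {o}")                      # verbalize the triple
        sims.append(float(q @ t / (np.linalg.norm(q) * np.linalg.norm(t) + 1e-9)))
    return float(np.mean(sims)) if sims else 0.0


def top_subgraphs(query: str, candidates: List[List[Triple]],
                  embed: Callable[[str], np.ndarray], k: int = 3) -> List[List[Triple]]:
    """Return the k candidate subgraphs best aligned with the query."""
    return sorted(candidates, key=lambda g: score_subgraph(query, g, embed), reverse=True)[:k]
```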
arXiv Detail & Related papers (2024-12-17T15:40:08Z)
- Distill-SynthKG: Distilling Knowledge Graph Synthesis Workflow for Improved Coverage and Efficiency [59.6772484292295]
Knowledge graphs (KGs) generated by large language models (LLMs) are increasingly valuable for Retrieval-Augmented Generation (RAG) applications.
Existing KG extraction methods rely on prompt-based approaches, which are inefficient for processing large-scale corpora.
We propose SynthKG, a multi-step, document-level KG synthesis workflow based on LLMs.
We also design a novel graph-based retrieval framework for RAG.
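A multi-step, document-level synthesis workflow might be decomposed as below: chunk the document, ask the LLM for entities per chunk, then ask for relations restricted to those entities, and merge the triples. The prompts and the `llm` callable are illustrative assumptions, not SynthKG's prompts.
```python
from typing import Callable, List, Tuple


def synthesize_kg(document: str,
                  llm: Callable[[str], str],
                  chunk_size: int = 1000) -> List[Tuple[str, str, str]]:
    """Multi-step, chunk-by-chunk KG synthesis (illustrative prompts)."""
    chunks = [document[i:i + chunk_size] for i in range(0, len(document), chunk_size)]
    triples: List[Tuple[str, str, str]] = []
    for chunk in chunks:
        # Step 1: name the entities mentioned in this chunk.
        entities = llm("List the named entities in this passage, one per line:\n" + chunk)
        # Step 2: extract relations restricted to those entities.
        raw = llm("Using only these entities:\n" + entities +
                  "\nextract (subject | relation | object) triples from:\n" + chunk)
        for line in raw.splitlines():
            parts = [p.strip() for p in line.split("|")]
            if len(parts) == 3:
                triples.append((parts[0], parts[1], parts[2]))
    return sorted(set(triples))                        # deduplicate across chunks
```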
arXiv Detail & Related papers (2024-10-22T00:47:54Z)
- GEM-RAG: Graphical Eigen Memories For Retrieval Augmented Generation [3.2027710059627545]
We introduce Graphical Eigen Memories For Retrieval Augmented Generation (GEM-RAG).
GEM-RAG works by tagging each chunk of text in a given text corpus with LLM-generated "utility" questions.
We evaluate GEM-RAG, using both UnifiedQA and GPT-3.5 Turbo as the LLMs, with SBERT, and OpenAI's text encoders on two standard QA tasks.
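The utility-question tagging can be pictured as follows: each chunk is paired with LLM-written questions it could answer, those questions are embedded, and retrieval matches the user query against the questions rather than the raw chunk text. The helper names and the single prompt are assumptions, and the graphical eigen-memory structure the method's name refers to is not reproduced here.
```python
from typing import Callable, Dict, List

import numpy as np


def tag_chunks(chunks: List[str],
               llm: Callable[[str], str],
               embed: Callable[[str], np.ndarray]) -> List[Dict]:
    """Attach LLM-generated 'utility' questions (and their embeddings) to each chunk."""
    tagged = []
    for chunk in chunks:
        questions = llm("Write three questions this passage can answer, one per line:\n"
                        + chunk).splitlines()
        tagged.append({"text": chunk,
                       "questions": questions,
                       "q_embs": [embed(q) for q in questions if q.strip()]})
    return tagged


def retrieve(query: str, tagged: List[Dict],
             embed: Callable[[str], np.ndarray], k: int = 3) -> List[str]:
    """Rank chunks by their best-matching utility question."""
    q = embed(query)

    def best(entry: Dict) -> float:
        return max((float(q @ e / (np.linalg.norm(q) * np.linalg.norm(e) + 1e-9))
                    for e in entry["q_embs"]), default=0.0)

    return [e["text"] for e in sorted(tagged, key=best, reverse=True)[:k]]
```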
arXiv Detail & Related papers (2024-09-23T21:42:47Z)
- PRAGyan -- Connecting the Dots in Tweets [0.0]
This research explores the integration of Knowledge Graphs (KGs) with Large Language Models (LLMs) to perform causal analysis of a tweet dataset.
We employ a Retrieval-Augmented Generation (RAG) model, utilizing a KG stored in Neo4j (a.k.a. PRAGyan), to retrieve relevant context for causal reasoning.
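Pulling KG context out of Neo4j for a RAG prompt typically looks like the snippet below, written against the official `neo4j` Python driver; the connection details, node labels, and Cypher query are placeholders, not the schema used in the paper.
```python
from neo4j import GraphDatabase    # pip install neo4j

# Connection details are placeholders for a local Neo4j instance.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Hypothetical schema: Tweet nodes connected to entity nodes by typed relationships.
CYPHER = """
MATCH (t:Tweet)-[r]->(e)
WHERE t.text CONTAINS $keyword
RETURN t.text AS tweet, type(r) AS relation, e.name AS entity
LIMIT $limit
"""


def kg_context(keyword: str, limit: int = 20) -> str:
    """Fetch KG facts around matching tweets and serialize them for an LLM prompt."""
    with driver.session() as session:
        records = session.run(CYPHER, keyword=keyword, limit=limit)
        facts = [f"({r['tweet']!r}) -[{r['relation']}]-> ({r['entity']})" for r in records]
    return "\n".join(facts)
```
The returned fact string would then be prepended to the causal-reasoning prompt sent to the LLM.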
arXiv Detail & Related papers (2024-07-18T21:49:32Z)
- KG-RAG: Bridging the Gap Between Knowledge and Creativity [0.0]
Large Language Model Agents (LMAs) face issues such as information hallucinations, catastrophic forgetting, and limitations in processing long contexts.
This paper introduces a KG-RAG (Knowledge Graph-Retrieval Augmented Generation) pipeline to enhance the knowledge capabilities of LMAs.
Preliminary experiments on the ComplexWebQuestions dataset demonstrate notable improvements in the reduction of hallucinated content.
arXiv Detail & Related papers (2024-05-20T14:03:05Z)
- Knowledge Graph-Augmented Language Models for Knowledge-Grounded Dialogue Generation [58.65698688443091]
We propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating context-relevant and knowledge-grounded dialogues with Knowledge Graphs (KGs).
Our framework first retrieves the relevant subgraph from the KG, and then enforces consistency across facts by perturbing their word embeddings conditioned on the retrieved subgraph.
We validate our SURGE framework on OpendialKG and KOMODIS datasets, showing that it generates high-quality dialogues that faithfully reflect the knowledge from KG.
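The first stage, retrieving a subgraph for the dialogue context, can be approximated as below by taking small ego-graphs around KG entities mentioned in the history; the naive string-match entity linking and the use of `networkx` are assumptions, and the embedding-perturbation consistency training is not reproduced.
```python
from typing import List

import networkx as nx


def retrieve_subgraph(dialogue_history: List[str], kg: nx.Graph, radius: int = 1) -> nx.Graph:
    """Union of small ego-graphs around KG entities mentioned in the dialogue."""
    text = " ".join(dialogue_history).lower()
    mentioned = [n for n in kg.nodes if str(n).lower() in text]   # naive string-match linking
    sub = nx.Graph()
    for entity in mentioned:
        sub = nx.compose(sub, nx.ego_graph(kg, entity, radius=radius))
    return sub
```
The retrieved subgraph would then be encoded together with the dialogue history to condition the response generator.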
arXiv Detail & Related papers (2023-05-30T08:36:45Z)
- DSKReG: Differentiable Sampling on Knowledge Graph for Recommendation with Relational GNN [59.160401038969795]
We propose differentiable sampling on Knowledge Graph for Recommendation with GNN (DSKReG).
We devise a differentiable sampling strategy, which enables the selection of relevant items to be jointly optimized with the model training procedure.
The experimental results demonstrate that our model outperforms state-of-the-art KG-based recommender systems.
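Differentiable neighbor sampling is commonly realized with a Gumbel-softmax relaxation so that gradients flow through which neighbors are selected; the PyTorch sketch below shows that generic relaxation and should not be read as DSKReG's actual layer.
```python
import torch
import torch.nn.functional as F


def soft_sample_neighbors(neighbor_embs: torch.Tensor,
                          query_emb: torch.Tensor,
                          tau: float = 0.5) -> torch.Tensor:
    """Differentiable selection of a neighbor.

    neighbor_embs: (num_neighbors, dim) candidate neighbor embeddings
    query_emb:     (dim,) embedding of the user/item node
    Returns a (dim,) soft-selected neighbor embedding through which
    gradients flow into the relevance scores.
    """
    logits = neighbor_embs @ query_emb                        # relevance scores, (num_neighbors,)
    weights = F.gumbel_softmax(logits, tau=tau, hard=False)   # relaxed one-hot sample
    return weights @ neighbor_embs                            # weighted mix of neighbors


# Toy check: gradients reach the neighbor embeddings via the sampling weights.
embs = torch.randn(8, 16, requires_grad=True)
query = torch.randn(16)
soft_sample_neighbors(embs, query).sum().backward()
print(embs.grad.shape)   # torch.Size([8, 16])
```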
arXiv Detail & Related papers (2021-08-26T16:19:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.