Knowledge Graph Prompting for Multi-Document Question Answering
- URL: http://arxiv.org/abs/2308.11730v3
- Date: Mon, 25 Dec 2023 17:03:05 GMT
- Title: Knowledge Graph Prompting for Multi-Document Question Answering
- Authors: Yu Wang, Nedim Lipka, Ryan A. Rossi, Alexa Siu, Ruiyi Zhang, Tyler Derr
- Abstract summary: We propose a Knowledge Graph Prompting (KGP) method to formulate the right context for prompting LLMs in multi-document question answering (MD-QA). For graph construction, we create a knowledge graph (KG) over multiple documents with nodes symbolizing passages or document structures (e.g., pages/tables).
- Score: 46.29217406937293
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The 'pre-train, prompt, predict' paradigm of large language models (LLMs) has
achieved remarkable success in open-domain question answering (OD-QA). However,
few works explore this paradigm in the scenario of multi-document question
answering (MD-QA), a task demanding a thorough understanding of the logical
associations among the contents and structures of different documents. To fill
this crucial gap, we propose a Knowledge Graph Prompting (KGP) method to
formulate the right context in prompting LLMs for MD-QA, which consists of a
graph construction module and a graph traversal module. For graph construction,
we create a knowledge graph (KG) over multiple documents with nodes symbolizing
passages or document structures (e.g., pages/tables), and edges denoting the
semantic/lexical similarity between passages or intra-document structural
relations. For graph traversal, we design an LLM-based graph traversal agent
that navigates across nodes and gathers supporting passages assisting LLMs in
MD-QA. The constructed graph serves as the global ruler that regulates the
transitional space among passages and reduces retrieval latency. Concurrently,
the graph traversal agent acts as a local navigator that gathers pertinent
context to progressively approach the question and guarantee retrieval quality.
Extensive experiments underscore the efficacy of KGP for MD-QA, signifying the
potential of leveraging graphs in enhancing the prompt design for LLMs. Our
code: https://github.com/YuWVandy/KG-LLM-MDQA.
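The graph-construction step described in the abstract can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: it uses only lexical (Jaccard word-overlap) similarity to draw edges between passage nodes, whereas KGP also uses semantic similarity and intra-document structural edges (pages/tables). All names here are illustrative.

```python
# Minimal sketch of passage-graph construction in the spirit of KGP:
# nodes are passages, and an edge connects two passages whose lexical
# (Jaccard) similarity exceeds a threshold. Semantic-embedding edges
# and structural page/table edges from the paper are omitted.
import re


def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two word sets."""
    return len(a & b) / len(a | b) if a | b else 0.0


def build_passage_graph(passages: list[str],
                        threshold: float = 0.2) -> dict[int, list[int]]:
    """Return an adjacency list over passage indices."""
    words = [set(re.findall(r"\w+", p.lower())) for p in passages]
    graph: dict[int, list[int]] = {i: [] for i in range(len(passages))}
    for i in range(len(passages)):
        for j in range(i + 1, len(passages)):
            if jaccard(words[i], words[j]) >= threshold:
                graph[i].append(j)
                graph[j].append(i)
    return graph


passages = [
    "The company reported revenue growth in 2021.",
    "Revenue growth in 2021 was driven by cloud services.",
    "The annual report also discusses employee headcount.",
]
graph = build_passage_graph(passages)  # passages 0 and 1 share an edge
```

Once built, such a graph constrains which passages are reachable from which, which is what the abstract means by the graph acting as a "global ruler" over the transitional space.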
Related papers
- NT-LLM: A Novel Node Tokenizer for Integrating Graph Structure into Large Language Models [26.739650151993928]
Graphs are a fundamental data structure for representing relationships in real-world scenarios.
Applying Large Language Models (LLMs) to graph-related tasks poses significant challenges.
We introduce Node Tokenizer for Large Language Models (NT-LLM), a novel framework that efficiently encodes graph structures.
arXiv Detail & Related papers (2024-10-14T17:21:57Z)
- Let's Ask GNN: Empowering Large Language Model for Graph In-Context Learning [28.660326096652437]
We introduce AskGNN, a novel approach that bridges the gap between sequential text processing and graph-structured data.
AskGNN employs a Graph Neural Network (GNN)-powered structure-enhanced retriever to select labeled nodes across graphs.
Experiments across three tasks and seven LLMs demonstrate AskGNN's superior effectiveness in graph task performance.
arXiv Detail & Related papers (2024-10-09T17:19:12Z)
- Integrating Large Language Models with Graph-based Reasoning for Conversational Question Answering [58.17090503446995]
We focus on a conversational question answering task which combines the challenges of understanding questions in context and reasoning over evidence gathered from heterogeneous sources like text, knowledge graphs, tables, and infoboxes.
Our method utilizes a graph structured representation to aggregate information about a question and its context.
arXiv Detail & Related papers (2024-06-14T13:28:03Z)
- A Semantic Mention Graph Augmented Model for Document-Level Event Argument Extraction [12.286432133599355]
Document-level Event Argument Extraction (DEAE) aims to identify arguments and their specific roles from an unstructured document.
Advanced approaches to DEAE utilize prompt-based methods to guide pre-trained language models (PLMs) in extracting arguments from input documents.
We propose a semantic mention Graph Augmented Model (GAM) to address the limitations of these approaches.
arXiv Detail & Related papers (2024-03-12T08:58:07Z)
- DocGraphLM: Documental Graph Language Model for Information Extraction [15.649726614383388]
We introduce DocGraphLM, a framework that combines pre-trained language models with graph semantics.
To achieve this, we propose 1) a joint encoder architecture to represent documents, and 2) a novel link prediction approach to reconstruct document graphs.
Our experiments on three SotA datasets show consistent improvement on IE and QA tasks with the adoption of graph features.
arXiv Detail & Related papers (2024-01-05T14:15:36Z)
- ReasoningLM: Enabling Structural Subgraph Reasoning in Pre-trained Language Models for Question Answering over Knowledge Graph [142.42275983201978]
We propose a subgraph-aware self-attention mechanism to imitate the GNN for performing structured reasoning.
We also adopt an adaptation tuning strategy to adapt the model parameters with 20,000 subgraphs with synthesized questions.
Experiments show that ReasoningLM surpasses state-of-the-art models by a large margin, even with fewer updated parameters and less training data.
arXiv Detail & Related papers (2023-12-30T07:18:54Z)
- Integrating Graphs with Large Language Models: Methods and Prospects [68.37584693537555]
Large language models (LLMs) have emerged as frontrunners, showcasing unparalleled prowess in diverse applications.
Merging the capabilities of LLMs with graph-structured data has been a topic of keen interest.
This paper bifurcates such integrations into two predominant categories.
arXiv Detail & Related papers (2023-10-09T07:59:34Z)
- Graph Attention with Hierarchies for Multi-hop Question Answering [19.398300844233837]
We present two extensions to the SOTA Graph Neural Network (GNN) based model for HotpotQA.
Experiments on HotpotQA demonstrate the efficiency of the proposed modifications.
arXiv Detail & Related papers (2023-01-27T15:49:50Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph(KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Semantic Graphs for Generating Deep Questions [98.5161888878238]
We propose a novel framework which first constructs a semantic-level graph for the input document and then encodes the semantic graph by introducing an attention-based GGNN (Att-GGNN)
On the HotpotQA deep-question centric dataset, our model greatly improves performance over questions requiring reasoning over multiple facts, leading to state-of-the-art performance.
arXiv Detail & Related papers (2020-04-27T10:52:52Z)
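Several of the papers above, KGP included, retrieve supporting evidence by walking a passage graph under the guidance of a language model. A minimal sketch of such a traversal loop follows; it is an illustration only, with a simple word-overlap scorer standing in for the LLM-based relevance ranking these systems actually use, and all function names are ours.

```python
# Sketch of an LLM-guided greedy graph traversal for retrieval:
# starting from a seed passage, repeatedly visit the frontier neighbor
# the ranker judges most relevant to the question, until a passage
# budget is reached. overlap_score is a stand-in for an LLM call.


def overlap_score(question: str, passage: str) -> int:
    """Stand-in for an LLM relevance judgment: shared-word count."""
    return len(set(question.lower().split()) & set(passage.lower().split()))


def traverse(graph: dict[int, list[int]], passages: list[str],
             question: str, seed: int, budget: int = 3) -> list[int]:
    """Greedy walk: always expand the highest-scoring frontier node."""
    visited = [seed]
    frontier = list(graph[seed])
    while frontier and len(visited) < budget:
        best = max(frontier, key=lambda i: overlap_score(question, passages[i]))
        visited.append(best)
        frontier.remove(best)
        frontier.extend(n for n in graph[best]
                        if n not in visited and n not in frontier)
    return visited


passages = ["paris is the capital of france",
            "france borders spain and italy",
            "the eiffel tower is in paris"]
graph = {0: [1, 2], 1: [0], 2: [0]}
order = traverse(graph, passages, "where is the eiffel tower", seed=0)
# Visits passage 2 before passage 1, since it overlaps the question more.
```

Because the walk only ever considers graph neighbors, retrieval is restricted to passages the graph deems related, which is how this family of methods trades exhaustive search for lower latency.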
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.