NuTrea: Neural Tree Search for Context-guided Multi-hop KGQA
- URL: http://arxiv.org/abs/2310.15484v1
- Date: Tue, 24 Oct 2023 03:24:15 GMT
- Title: NuTrea: Neural Tree Search for Context-guided Multi-hop KGQA
- Authors: Hyeong Kyu Choi and Seunghun Lee and Jaewon Chu and Hyunwoo J. Kim
- Abstract summary: We propose a tree search-based GNN model that incorporates the broader Knowledge Graph context.
NuTrea provides a powerful means to query the KG with complex natural language questions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multi-hop Knowledge Graph Question Answering (KGQA) is a task that involves
retrieving nodes from a knowledge graph (KG) to answer natural language
questions. Recent GNN-based approaches formulate this task as a KG path
searching problem, where messages are sequentially propagated from the seed
node towards the answer nodes. However, these messages are past-oriented, and
they do not consider the full KG context. To make matters worse, KG nodes often
represent proper noun entities and are sometimes encrypted, being uninformative
in selecting between paths. To address these problems, we propose Neural Tree
Search (NuTrea), a tree search-based GNN model that incorporates the broader KG
context. Our model adopts a message-passing scheme that probes the unreached
subtree regions to boost the past-oriented embeddings. In addition, we
introduce the Relation Frequency-Inverse Entity Frequency (RF-IEF) node
embedding that considers the global KG context to better characterize ambiguous
KG nodes. The general effectiveness of our approach is demonstrated through
experiments on three major multi-hop KGQA benchmark datasets, and our extensive
analyses further validate its expressiveness and robustness. Overall, NuTrea
provides a powerful means to query the KG with complex natural language
questions. Code is available at https://github.com/mlvlab/NuTrea.
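The abstract names the Relation Frequency-Inverse Entity Frequency (RF-IEF) embedding only by its TF-IDF-style name; the paper's exact formulation is not given here. As a rough, hypothetical illustration of the underlying idea, the sketch below weights each relation incident to an entity by how rarely that relation appears across the KG, so that globally common relations contribute less to characterizing a node. The toy triples and the `rf_ief` helper are assumptions, not the paper's implementation.

```python
import math
from collections import Counter, defaultdict

# Toy KG as (head, relation, tail) triples -- hypothetical data.
triples = [
    ("Paris", "capital_of", "France"),
    ("Paris", "located_in", "Europe"),
    ("Lyon", "located_in", "Europe"),
    ("France", "member_of", "EU"),
]

# RF: how often each relation is incident to a given entity
# (counting both head and tail positions).
rf = defaultdict(Counter)
for h, r, t in triples:
    rf[h][r] += 1
    rf[t][r] += 1

# IEF: log-scaled inverse of how many entities each relation touches,
# mirroring IDF so that ubiquitous relations are down-weighted.
entities = set(rf)
rel_entities = defaultdict(set)
for h, r, t in triples:
    rel_entities[r].update((h, t))
ief = {r: math.log(len(entities) / len(es)) for r, es in rel_entities.items()}

def rf_ief(entity):
    """Characterize an entity by its IEF-weighted relation profile."""
    return {r: count * ief[r] for r, count in rf[entity].items()}
```

Under this toy weighting, an entity with an uninformative label (e.g. an opaque node ID) still gets a usable feature vector from the relations around it, which is the motivation the abstract gives for RF-IEF.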
Related papers
- A Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning
We propose a prompt-based KG foundation model via in-context learning, namely KG-ICL, to achieve a universal reasoning ability.
To encode prompt graphs with the generalization ability to unseen entities and relations in queries, we first propose a unified tokenizer.
Then, we propose two message passing neural networks to perform prompt encoding and KG reasoning, respectively.
arXiv Detail & Related papers (2024-10-16T06:47:18Z)
- Tree-of-Traversals: A Zero-Shot Reasoning Algorithm for Augmenting Black-box Language Models with Knowledge Graphs
Knowledge graphs (KGs) complement Large Language Models (LLMs) by providing reliable, structured, domain-specific, and up-to-date external knowledge.
We introduce Tree-of-Traversals, a novel zero-shot reasoning algorithm that enables augmentation of black-box LLMs with one or more KGs.
arXiv Detail & Related papers (2024-07-31T06:01:24Z)
- GNN-RAG: Graph Neural Retrieval for Large Language Model Reasoning
We introduce GNN-RAG, a novel method for combining language understanding abilities of LLMs with the reasoning abilities of GNNs in a retrieval-augmented generation (RAG) style.
In our GNN-RAG framework, the GNN acts as a dense subgraph reasoner to extract useful graph information.
Experiments show that GNN-RAG achieves state-of-the-art performance on two widely used KGQA benchmarks.
arXiv Detail & Related papers (2024-05-30T15:14:24Z)
- Multi-hop Question Answering over Knowledge Graphs using Large Language Models
We evaluate the capability of Large Language Models (LLMs) to answer questions over knowledge graphs that involve multiple hops.
We show that, depending on the size and nature of the KG, different approaches are needed to extract and feed the relevant information to an LLM.
arXiv Detail & Related papers (2024-04-30T03:31:03Z)
- Generate-on-Graph: Treat LLM as both Agent and KG in Incomplete Knowledge Graph Question Answering
We propose a training-free method called Generate-on-Graph (GoG) to generate new factual triples while exploring Knowledge Graphs (KGs).
GoG performs reasoning through a Thinking-Searching-Generating framework, which treats the LLM as both Agent and KG in incomplete KGQA (IKGQA).
arXiv Detail & Related papers (2024-04-23T04:47:22Z)
- EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph Completion
Commonsense knowledge graphs (CSKGs) utilize free-form text to represent named entities, short phrases, and events as their nodes.
Current methods leverage semantic similarities to increase graph density, but the semantic plausibility of the nodes and their relations is under-explored.
We propose to adopt textual entailment to find implicit entailment relations between CSKG nodes, to effectively densify the subgraph connecting nodes within the same conceptual class.
arXiv Detail & Related papers (2024-02-15T02:27:23Z)
- Knowledge Graphs Querying
We aim to unite different interdisciplinary topics and concepts that have been developed for KG querying.
Recent advances on KG and query embedding, multimodal KG, and KG-QA come from deep learning, IR, NLP, and computer vision domains.
arXiv Detail & Related papers (2023-05-23T19:32:42Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas
We propose a framework for complex query answering (CQA) that decouples the Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- Connecting the Dots: A Knowledgeable Path Generator for Commonsense Question Answering
This paper augments a general commonsense QA framework with a knowledgeable path generator.
By extrapolating over existing paths in a KG with a state-of-the-art language model, our generator learns to connect a pair of entities in text with a dynamic, and potentially novel, multi-hop relational path.
arXiv Detail & Related papers (2020-05-02T03:53:21Z)
- Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks
Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers.
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
arXiv Detail & Related papers (2020-04-13T15:43:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.