ReaRev: Adaptive Reasoning for Question Answering over Knowledge Graphs
- URL: http://arxiv.org/abs/2210.13650v1
- Date: Mon, 24 Oct 2022 23:09:52 GMT
- Title: ReaRev: Adaptive Reasoning for Question Answering over Knowledge Graphs
- Authors: Costas Mavromatis, George Karypis
- Abstract summary: Knowledge Graph Question Answering (KGQA) involves retrieving entities as answers from a Knowledge Graph (KG) using natural language queries.
We introduce a new approach to KGQA reasoning that addresses both instruction decoding and execution.
Experimental results on three KGQA benchmarks demonstrate ReaRev's effectiveness compared with previous state-of-the-art.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge Graph Question Answering (KGQA) involves retrieving entities as
answers from a Knowledge Graph (KG) using natural language queries. The
challenge is to learn to reason over question-relevant KG facts that traverse
KG entities and lead to the question answers. To facilitate reasoning, the
question is decoded into instructions, which are dense question representations
used to guide the KG traversals. However, if the derived instructions do not
exactly match the underlying KG information, they may lead to reasoning under
irrelevant context. Our method, termed ReaRev, introduces a new approach to
KGQA reasoning with respect to both instruction decoding and execution. To improve
instruction decoding, we perform reasoning in an adaptive manner, where
KG-aware information is used to iteratively update the initial instructions. To
improve instruction execution, we emulate breadth-first search (BFS) with graph
neural networks (GNNs). The BFS strategy treats the instructions as a set and
allows our method to decide on their execution order on the fly. Experimental
results on three KGQA benchmarks demonstrate ReaRev's effectiveness
compared with previous state-of-the-art methods, especially when the KG is incomplete
or when we tackle complex questions. Our code is publicly available at
https://github.com/cmavro/ReaRev_KGQA.
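The abstract's two ideas, selecting instructions as an unordered set (BFS-style execution) and refreshing them with KG-aware states (adaptive decoding), can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' implementation: the function name, the dot-product instruction scoring, the tanh update, and the 0.5 mixing factor are all assumptions made for illustration.

```python
import numpy as np

def rearev_style_reasoning(adj, node_feats, instructions, seed_mask, num_steps=3):
    """Illustrative BFS-style reasoning loop over a KG subgraph.

    adj          : (N, N) adjacency matrix of the question subgraph
    node_feats   : (N, d) entity feature vectors
    instructions : list of (d,) question-instruction vectors
    seed_mask    : (N,) 1.0 for question (seed) entities, else 0.0
    Returns per-entity answer scores of shape (N,).
    """
    h = node_feats * seed_mask[:, None]   # reasoning starts from seed entities
    instrs = list(instructions)
    for _ in range(num_steps):
        # Treat instructions as a set: score each against the aggregate of the
        # currently active node states and pick the best match on the fly.
        context = h.sum(axis=0)
        scores = [float(ins @ context) for ins in instrs]
        k = int(np.argmax(scores))
        ins = instrs[k]
        # BFS-emulating GNN step: gate nodes by instruction relevance and
        # propagate their states one hop along the graph edges.
        gate = node_feats @ ins                      # per-node relevance
        msg = adj.T @ (h * gate[:, None])            # one-hop message passing
        h = np.tanh(msg + h)
        # Adaptive decoding: update the executed instruction with a
        # KG-aware summary of the new node states.
        instrs[k] = ins + 0.5 * h.mean(axis=0)
    return np.linalg.norm(h, axis=1)                 # per-entity answer scores
```

Under this reading, "deciding the execution order on the fly" is just the argmax over instruction-context scores inside the loop, and "iteratively updating the initial instructions" is the in-place refresh of `instrs[k]` after each propagation step.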
Related papers
- Knowledge Graph-extended Retrieval Augmented Generation for Question Answering [10.49712834719005]
This paper proposes a system that integrates Large Language Models (LLMs) and Knowledge Graphs (KGs) without requiring training.
The resulting approach can be classified as a specific form of a Retrieval Augmented Generation (RAG) with a KG.
It includes a question decomposition module to enhance multi-hop information retrieval and answerability.
arXiv Detail & Related papers (2025-04-11T18:03:02Z)
- Question-Aware Knowledge Graph Prompting for Enhancing Large Language Models [51.47994645529258]
We propose Question-Aware Knowledge Graph Prompting (QAP), which incorporates question embeddings into GNN aggregation to dynamically assess KG relevance.
Experimental results demonstrate that QAP outperforms state-of-the-art methods across multiple datasets, highlighting its effectiveness.
arXiv Detail & Related papers (2025-03-30T17:09:11Z)
- Decoding on Graphs: Faithful and Sound Reasoning on Knowledge Graphs through Generation of Well-Formed Chains [66.55612528039894]
Knowledge Graphs (KGs) can serve as reliable knowledge sources for question answering (QA).
We present DoG (Decoding on Graphs), a novel framework that facilitates a deep synergy between LLMs and KGs.
Experiments across various KGQA tasks with different background KGs demonstrate that DoG achieves superior and robust performance.
arXiv Detail & Related papers (2024-10-24T04:01:40Z)
- A Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning [17.676185326247946]
We propose a prompt-based KG foundation model via in-context learning, namely KG-ICL, to achieve a universal reasoning ability.
To encode prompt graphs with the generalization ability to unseen entities and relations in queries, we first propose a unified tokenizer.
Then, we propose two message passing neural networks to perform prompt encoding and KG reasoning, respectively.
arXiv Detail & Related papers (2024-10-16T06:47:18Z)
- Generate-on-Graph: Treat LLM as both Agent and KG in Incomplete Knowledge Graph Question Answering [87.67177556994525]
We propose a training-free method called Generate-on-Graph (GoG) to generate new factual triples while exploring Knowledge Graphs (KGs).
GoG performs reasoning through a Thinking-Searching-Generating framework, which treats the LLM as both an agent and a KG in incomplete KGQA (IKGQA).
arXiv Detail & Related papers (2024-04-23T04:47:22Z)
- Open-Set Knowledge-Based Visual Question Answering with Inference Paths [79.55742631375063]
The purpose of Knowledge-Based Visual Question Answering (KB-VQA) is to provide a correct answer to the question with the aid of external knowledge bases.
We propose a new retriever-ranker paradigm for KB-VQA, Graph pATH rankER (GATHER for brevity).
Specifically, it contains graph constructing, pruning, and path-level ranking, which not only retrieves accurate answers but also provides inference paths that explain the reasoning process.
arXiv Detail & Related papers (2023-10-12T09:12:50Z)
- Graph Reasoning for Question Answering with Triplet Retrieval [33.454090126152714]
We propose a simple yet effective method to retrieve the most relevant triplets from knowledge graphs (KGs).
Our method can outperform state-of-the-art up to 4.6% absolute accuracy.
arXiv Detail & Related papers (2023-05-30T04:46:28Z)
- MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering [65.62309538202771]
Knowledge Graphs (KGs) are symbolically structured stores of facts.
KG embeddings provide concise data for NLP tasks that require implicit knowledge about the real world.
We propose a memory-efficient KG embedding model, which yields SOTA-comparable performance on link prediction tasks and KG-based Question Answering.
arXiv Detail & Related papers (2022-04-22T10:47:03Z)
- Improving Question Answering over Knowledge Graphs Using Graph Summarization [0.2752817022620644]
The key idea is to represent questions and entities of a Knowledge Graph as low-dimensional embeddings.
We propose a graph summarization technique using Recurrent Convolutional Neural Network (RCNN) and GCN.
The proposed graph summarization technique can be used to address the issue that KGQA systems cannot answer questions with an uncertain number of answers.
arXiv Detail & Related papers (2022-03-25T10:57:10Z)
- QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering [122.84513233992422]
We propose a new model, QA-GNN, which addresses the problem of answering questions using knowledge from pre-trained language models (LMs) and knowledge graphs (KGs).
We show its improvement over existing LM and LM+KG models, as well as its capability to perform interpretable and structured reasoning.
arXiv Detail & Related papers (2021-04-13T17:32:51Z)
- Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks [53.58077686470096]
Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers.
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
arXiv Detail & Related papers (2020-04-13T15:43:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.