Knowledge Base Question Answering by Case-based Reasoning over Subgraphs
- URL: http://arxiv.org/abs/2202.10610v1
- Date: Tue, 22 Feb 2022 01:34:35 GMT
- Title: Knowledge Base Question Answering by Case-based Reasoning over Subgraphs
- Authors: Rajarshi Das, Ameya Godbole, Ankita Naik, Elliot Tower, Robin Jia,
Manzil Zaheer, Hannaneh Hajishirzi, Andrew McCallum
- Abstract summary: We show that our model answers queries requiring complex reasoning patterns more effectively than existing KG completion algorithms.
The proposed model outperforms or performs competitively with state-of-the-art models on several KBQA benchmarks.
- Score: 81.22050011503933
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Question answering (QA) over real-world knowledge bases (KBs) is challenging
because of the diverse (essentially unbounded) types of reasoning patterns
needed. However, we hypothesize that in a large KB, the reasoning patterns
required to answer a query type recur for various entities in their respective subgraph
neighborhoods. Leveraging this structural similarity between local
neighborhoods of different subgraphs, we introduce a semiparametric model with
(i) a nonparametric component that for each query, dynamically retrieves other
similar $k$-nearest neighbor (KNN) training queries along with query-specific
subgraphs and (ii) a parametric component that is trained to identify the
(latent) reasoning patterns from the subgraphs of KNN queries and then apply them
to the subgraph of the target query. We also propose a novel algorithm to
select a query-specific compact subgraph from within the massive knowledge
graph (KG), allowing us to scale to the full Freebase KG, which contains billions of
edges. We show that our model answers queries requiring complex reasoning
patterns more effectively than existing KG completion algorithms. The proposed
model outperforms or performs competitively with state-of-the-art models on
several KBQA benchmarks.
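The abstract describes three pieces: nonparametric retrieval of KNN training queries, query-specific compact subgraph selection, and a parametric reasoner over the retrieved subgraphs. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' implementation; the pretrained query embeddings, the dictionary-based adjacency list, and the cosine-similarity scoring that stands in for the trained graph encoder are all assumptions made only for this example.
```python
# Hypothetical sketch of the semiparametric pipeline described in the abstract.
# None of these names come from the paper's code; they only illustrate the idea.

from collections import deque
import numpy as np


def knn_training_queries(query_vec, train_query_vecs, k=5):
    """Nonparametric component: indices of the k most similar training queries
    under cosine similarity of (assumed) pretrained query embeddings."""
    q = query_vec / np.linalg.norm(query_vec)
    t = train_query_vecs / np.linalg.norm(train_query_vecs, axis=1, keepdims=True)
    return np.argsort(-(t @ q))[:k]


def compact_subgraph(kg_adj, seed_entities, max_hops=2):
    """Toy query-specific subgraph selection: breadth-first expansion around the
    query's seed entities, truncated at max_hops. The paper's actual selection
    algorithm is more involved; this is only a placeholder."""
    visited = set(seed_entities)
    frontier = deque((e, 0) for e in seed_entities)
    edges = []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for rel, nbr in kg_adj.get(node, []):
            edges.append((node, rel, nbr))
            if nbr not in visited:
                visited.add(nbr)
                frontier.append((nbr, depth + 1))
    return edges


def score_answer_nodes(target_node_embs, knn_answer_embs):
    """Stand-in for the parametric component: score each node of the target
    query's subgraph by similarity to a prototype of the answer nodes found in
    the KNN queries' subgraphs (the paper trains a graph encoder for this)."""
    proto = knn_answer_embs.mean(axis=0)
    proto = proto / np.linalg.norm(proto)
    t = target_node_embs / np.linalg.norm(target_node_embs, axis=1, keepdims=True)
    return t @ proto  # higher score = more likely answer entity
```
In this toy version, answers to the retrieved KNN queries induce a prototype representation, and nodes in the target query's subgraph are ranked against it; the paper's contribution is learning representations in which this kind of cross-subgraph comparison surfaces the shared (latent) reasoning pattern.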
Related papers
- Effective Instruction Parsing Plugin for Complex Logical Query Answering on Knowledge Graphs [51.33342412699939]
Knowledge Graph Query Embedding (KGQE) aims to embed First-Order Logic (FOL) queries in a low-dimensional KG space for complex reasoning over incomplete KGs.
Recent studies integrate various external information (such as entity types and relation context) to better capture the logical semantics of FOL queries.
We propose an effective Query Instruction Parsing Plugin (QIPP) that captures latent query patterns from code-like query instructions.
arXiv Detail & Related papers (2024-10-27T03:18:52Z)
- Multi-hop Question Answering over Knowledge Graphs using Large Language Models [1.8130068086063336]
We evaluate the capability of Large Language Models (LLMs) to answer questions over knowledge graphs that involve multiple hops.
We show that, depending on the size and nature of the KG, different approaches are needed to extract and feed the relevant information to an LLM.
arXiv Detail & Related papers (2024-04-30T03:31:03Z)
- Meta Operator for Complex Query Answering on Knowledge Graphs [58.340159346749964]
We argue that different logical operator types, rather than the different complex query types, are the key to improving generalizability.
We propose a meta-learning algorithm to learn the meta-operators with limited data and adapt them to different instances of operators under various complex queries.
Empirical results show that learning meta-operators is more effective than learning original CQA or meta-CQA models.
arXiv Detail & Related papers (2024-03-15T08:54:25Z)
- Prompting Large Language Models with Chain-of-Thought for Few-Shot Knowledge Base Question Generation [19.327008532572645]
Question Generation over Knowledge Bases (KBQG) aims to convert a logical form into a natural language question.
We propose to apply Chain-of-Thought prompting, an in-context learning strategy for reasoning.
We conduct extensive experiments over three public KBQG datasets.
arXiv Detail & Related papers (2023-10-12T15:08:14Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples the Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Query Embedding on Hyper-relational Knowledge Graphs [0.4779196219827507]
Multi-hop logical reasoning is an established problem in the field of representation learning on knowledge graphs.
We extend the multi-hop reasoning problem to hyper-relational KGs, allowing us to tackle this new type of complex query.
arXiv Detail & Related papers (2021-06-15T14:08:50Z)
- Robust Question Answering Through Sub-part Alignment [53.94003466761305]
We model question answering as an alignment problem.
We train our model on SQuAD v1.1 and test it on several adversarial and out-of-domain datasets.
arXiv Detail & Related papers (2020-04-30T09:10:57Z)