Differentiable Reasoning over a Virtual Knowledge Base
- URL: http://arxiv.org/abs/2002.10640v1
- Date: Tue, 25 Feb 2020 03:13:32 GMT
- Title: Differentiable Reasoning over a Virtual Knowledge Base
- Authors: Bhuwan Dhingra, Manzil Zaheer, Vidhisha Balachandran, Graham Neubig,
Ruslan Salakhutdinov, William W. Cohen
- Abstract summary: We consider the task of answering complex multi-hop questions using a corpus as a virtual knowledge base (KB).
In particular, we describe a neural module, DrKIT, that traverses textual data like a KB, softly following paths of relations between mentions of entities in the corpus.
DrKIT is very efficient, processing 10-100x more queries per second than existing multi-hop systems.
- Score: 156.94984221342716
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the task of answering complex multi-hop questions using a corpus
as a virtual knowledge base (KB). In particular, we describe a neural module,
DrKIT, that traverses textual data like a KB, softly following paths of
relations between mentions of entities in the corpus. At each step the module
uses a combination of sparse-matrix TFIDF indices and a maximum inner product
search (MIPS) on a special index of contextual representations of the mentions.
This module is differentiable, so the full system can be trained end-to-end
using gradient based methods, starting from natural language inputs. We also
describe a pretraining scheme for the contextual representation encoder by
generating hard negative examples using existing knowledge bases. We show that
DrKIT improves accuracy by 9 points on 3-hop questions in the MetaQA dataset,
cutting the gap between text-based and KB-based state-of-the-art by 70%. On
HotpotQA, DrKIT leads to a 10% improvement over a BERT-based re-ranking
approach to retrieving the relevant passages required to answer a question.
DrKIT is also very efficient, processing 10-100x more queries per second than
existing multi-hop systems.
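The abstract above describes each hop as combining a sparse TFIDF co-occurrence lookup with a maximum inner product search (MIPS) over contextual mention encodings, and then mapping the weighted mentions back onto entities for the next hop. Below is a rough, hypothetical sketch of what one such hop could look like; the matrix names and shapes, the brute-force inner product in place of a real MIPS index, and the top-k pruning at the end are all assumptions for illustration, not the authors' released implementation.

```python
# A minimal, illustrative sketch of one DrKIT-style hop (not the authors' code).
# Hypothetical: matrix names/shapes, brute-force inner product instead of a MIPS
# index, and the simple top-k pruning and normalization at the end.
import numpy as np


def drkit_hop(entity_weights, query_vec, ent2men_tfidf, mention_encodings, men2ent, top_k=100):
    """One soft hop: weighted entities -> weighted mentions -> weighted entities.

    entity_weights    : (num_entities,) soft weights over entities from the previous hop
    query_vec         : (d,) dense encoding of the relation expressed by the question
    ent2men_tfidf     : (num_entities, num_mentions) TFIDF co-occurrence matrix
                        (a scipy.sparse matrix in practice; dense also works here)
    mention_encodings : (num_mentions, d) contextual mention representations (the MIPS index)
    men2ent           : (num_mentions, num_entities) mention-to-entity (coreference) map
    """
    # Sparse evidence: mentions that co-occur with the currently weighted entities.
    sparse_scores = ent2men_tfidf.T.dot(entity_weights)       # (num_mentions,)

    # Dense evidence: inner product between the hop query and every mention encoding.
    # A production system would query an approximate MIPS index instead of a full matmul.
    dense_scores = mention_encodings.dot(query_vec)           # (num_mentions,)

    # Combine both evidence sources and keep only the strongest mentions (soft pruning).
    mention_scores = sparse_scores * np.maximum(dense_scores, 0.0)
    if 0 < top_k < mention_scores.shape[0]:
        cutoff = np.partition(mention_scores, -top_k)[-top_k]
        mention_scores = np.where(mention_scores >= cutoff, mention_scores, 0.0)

    # Aggregate mention scores back onto entities to obtain weights for the next hop.
    next_entity_weights = men2ent.T.dot(mention_scores)       # (num_entities,)
    total = next_entity_weights.sum()
    return next_entity_weights / total if total > 0 else next_entity_weights
```

Because every operation in such a hop is a matrix product or an element-wise combination, gradients can flow through the dense scores into the mention and query encoders, which is what makes end-to-end training possible; the exact pruning and normalization used in the paper may differ from this sketch.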
Related papers
- HOLMES: Hyper-Relational Knowledge Graphs for Multi-hop Question Answering using LLMs [9.559336828884808]
Large Language Models (LLMs) are adept at answering simple (single-hop) questions.
As the complexity of the questions increases, the performance of LLMs degrades.
Recent methods try to reduce this burden by integrating structured knowledge triples into the raw text.
We propose to use a knowledge graph (KG) that is context-aware and is distilled to contain query-relevant information.
arXiv Detail & Related papers (2024-06-10T05:22:49Z)
- Hypergraph Enhanced Knowledge Tree Prompt Learning for Next-Basket Recommendation [50.55786122323965]
Next-basket recommendation (NBR) aims to infer the items in the next basket given the corresponding basket sequence.
HEKP4NBR transforms the knowledge graph (KG) into prompts, namely the Knowledge Tree Prompt (KTP), to help the PLM encode the Out-Of-Vocabulary (OOV) item IDs.
A hypergraph convolutional module is designed to build a hypergraph based on item similarities measured by an MoE model from multiple aspects.
arXiv Detail & Related papers (2023-12-26T02:12:21Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for the multi-hop KGQA task that unifies retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Knowledge Base Question Answering by Case-based Reasoning over Subgraphs [81.22050011503933]
We show that our model answers queries requiring complex reasoning patterns more effectively than existing KG completion algorithms.
The proposed model outperforms or performs competitively with state-of-the-art models on several KBQA benchmarks.
arXiv Detail & Related papers (2022-02-22T01:34:35Z)
- SYGMA: System for Generalizable Modular Question Answering Over Knowledge Bases [57.89642289610301]
We present SYGMA, a modular approach facilitating generalizability across multiple knowledge bases and multiple reasoning types.
We demonstrate the effectiveness of our system by evaluating on datasets belonging to two distinct knowledge bases, DBpedia and Wikidata.
arXiv Detail & Related papers (2021-09-28T01:57:56Z)
- Efficient Contextualization using Top-k Operators for Question Answering over Knowledge Graphs [24.520002698010856]
This work presents ECQA, an efficient method that prunes irrelevant parts of the search space using KB-aware signals.
Experiments with two recent QA benchmarks demonstrate the superiority of ECQA over state-of-the-art baselines with respect to answer presence, size of the search space, and runtimes.
arXiv Detail & Related papers (2021-08-19T10:06:14Z)
- Answering Complex Open-Domain Questions with Multi-Hop Dense Retrieval [117.07047313964773]
We propose a simple and efficient multi-hop dense retrieval approach for answering complex open-domain questions.
Our method does not require access to any corpus-specific information, such as inter-document hyperlinks or human-annotated entity markers.
Our system also yields a much better efficiency-accuracy trade-off, matching the best published accuracy on HotpotQA while being 10 times faster at inference time.
arXiv Detail & Related papers (2020-09-27T06:12:29Z)
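The multi-hop dense retrieval entry directly above describes an iterative loop in which the query for the next hop is re-encoded together with the passage retrieved at the current hop. A minimal sketch of that general idea follows; the `encode_query` callable and the in-memory `passage_vectors` matrix are hypothetical stand-ins for a trained query encoder and an approximate-nearest-neighbor passage index, not the released system of any paper listed here.

```python
# A rough sketch of greedy multi-hop dense retrieval; encode_query and
# passage_vectors are hypothetical stand-ins, not any paper's released code.
import numpy as np


def multi_hop_retrieve(question, passages, passage_vectors, encode_query, hops=2):
    """Retrieve a chain of passages, re-encoding the query with each retrieved passage.

    question        : natural-language question (string)
    passages        : list of passage strings
    passage_vectors : (num_passages, d) dense passage encodings
    encode_query    : callable mapping a string to a (d,) query vector
    """
    query_text = question
    chain = []
    for _ in range(hops):
        q = encode_query(query_text)              # dense query for this hop
        scores = passage_vectors @ q              # inner-product search (brute force here)
        best = int(np.argmax(scores))             # keep the single best passage per hop
        chain.append((best, passages[best]))
        # Expand the query with the retrieved passage so the next hop can follow the chain.
        query_text = question + " " + passages[best]
    return chain
```

In practice, systems of this kind typically keep a beam of top-k candidate passages per hop rather than the single argmax used in this sketch.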