Improving Embedded Knowledge Graph Multi-hop Question Answering by
introducing Relational Chain Reasoning
- URL: http://arxiv.org/abs/2110.12679v1
- Date: Mon, 25 Oct 2021 06:53:02 GMT
- Title: Improving Embedded Knowledge Graph Multi-hop Question Answering by
introducing Relational Chain Reasoning
- Authors: Weiqiang Jin, Hang Yu, Xi Tao, Ruiping Yin
- Abstract summary: Knowledge Base Question Answering (KBQA) aims to answer user questions from a knowledge base (KB) by identifying the reasoning relations between the topic entity and the answer.
As a complex branch task of KBQA, multi-hop KGQA requires reasoning over multi-hop relational chains preserved in the structured KG.
- Score: 8.05076085499457
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge Base Question Answering (KBQA) aims to answer user
questions from a knowledge base (KB) by identifying the reasoning relations
between the topic entity and the answer. As a complex branch task of KBQA,
multi-hop KGQA requires reasoning over multi-hop relational chains preserved
in the KG to arrive at the right answer. Despite the successes made in recent
years, the existing works on answering multi-hop complex questions face the
following challenges: i) poor performance due to neglecting the explicit
relational chain order and the relation types reflected in user questions;
ii) failing to consider the implicit relations between the topic entity and
the answer implied in the structured KG, because of the limited neighborhood
size constraints in subgraph retrieval-based algorithms. To address these
issues in multi-hop KGQA, we propose a novel model in this paper, namely
Relational Chain-based Embedded KGQA (Rce-KGQA), which simultaneously
utilizes the explicit relational chain described in the natural language
question and the implicit relational chain stored in the structured KG. Our
extensive empirical study on two open-domain benchmarks proves that our
method significantly outperforms state-of-the-art counterparts like GraftNet,
PullNet and EmbedKGQA. Comprehensive ablation experiments also verify the
effectiveness of our method for multi-hop KGQA tasks. We have made our
model's source code available at GitHub: https://github.com/albert-jin/Rce-KGQA.
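
The core idea of the abstract, answering a question by following an ordered chain of relations from the topic entity through the KG, can be illustrated with a minimal sketch. This is not the paper's implementation; the toy facts, entity names, and the `follow_chain` helper are purely illustrative:

```python
# Minimal sketch (not Rce-KGQA itself): answering a multi-hop question by
# following an explicit relational chain over a toy knowledge graph.
from collections import defaultdict

# Toy KG stored as head -> relation -> set of tails (illustrative facts only).
kg = defaultdict(lambda: defaultdict(set))
for head, rel, tail in [
    ("J.K._Rowling", "wrote", "Harry_Potter"),
    ("Harry_Potter", "genre", "Fantasy"),
    ("J.R.R._Tolkien", "wrote", "The_Hobbit"),
    ("The_Hobbit", "genre", "Fantasy"),
]:
    kg[head][rel].add(tail)

def follow_chain(topic_entity, relation_chain):
    """Return all entities reachable from topic_entity by applying the
    relations in relation_chain in order (one hop per relation)."""
    frontier = {topic_entity}
    for rel in relation_chain:
        frontier = {t for h in frontier for t in kg[h][rel]}
    return frontier

# "What is the genre of the book written by J.K. Rowling?"
# -> topic entity "J.K._Rowling", relational chain [wrote, genre]
print(follow_chain("J.K._Rowling", ["wrote", "genre"]))  # {'Fantasy'}
```

Embedding-based methods like those surveyed below replace this exact symbolic traversal with scoring in a learned vector space, which also recovers implicit relations missing from the stored triples.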
Related papers
- Logical Message Passing Networks with One-hop Inference on Atomic
Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples the Knowledge Graph embeddings from the neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graphs (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Reasoning over Multi-view Knowledge Graphs [59.99051368907095]
ROMA is a novel framework for answering logical queries over multi-view KGs.
It scales up to KGs of large sizes (e.g., millions of facts) and fine-granular views.
It generalizes to query structures and KG views that are unobserved during training.
arXiv Detail & Related papers (2022-09-27T21:32:20Z)
- Exploiting Hybrid Semantics of Relation Paths for Multi-hop Question Answering Over Knowledge Graphs [31.088325888508137]
This paper proposes improving multi-hop KGQA by exploiting relation paths' hybrid semantics.
We integrate explicit textual information and implicit KG structural features of relation paths based on a novel rotate-and-scale entity link prediction framework.
arXiv Detail & Related papers (2022-09-02T08:07:37Z)
- QAGCN: Answering Multi-Relation Questions via Single-Step Implicit Reasoning over Knowledge Graphs [12.354648004427824]
Multi-relation question answering (QA) is a challenging task.
Recent methods with explicit multi-step reasoning over KGs have been prominently used in this task.
We argue that multi-relation QA can be achieved via end-to-end single-step implicit reasoning.
arXiv Detail & Related papers (2022-06-03T21:01:48Z)
- Knowledge Base Question Answering by Case-based Reasoning over Subgraphs [81.22050011503933]
We show that our model answers queries requiring complex reasoning patterns more effectively than existing KG completion algorithms.
The proposed model outperforms or performs competitively with state-of-the-art models on several KBQA benchmarks.
arXiv Detail & Related papers (2022-02-22T01:34:35Z)
- Path-Enhanced Multi-Relational Question Answering with Knowledge Graph Embeddings [16.21156041758793]
We propose a Path and Knowledge Embedding-Enhanced multi-relational Question Answering model (PKEEQA).
We show that PKEEQA improves KBQA models' performance on multi-relational question answering, with a degree of explainability derived from the paths.
arXiv Detail & Related papers (2021-10-29T08:37:46Z)
- Relation-Guided Pre-Training for Open-Domain Question Answering [67.86958978322188]
We propose a Relation-Guided Pre-Training (RGPT-QA) framework to solve complex open-domain questions.
We show that RGPT-QA achieves 2.2%, 2.4%, and 6.3% absolute improvement in Exact Match accuracy on Natural Questions, TriviaQA, and WebQuestions.
arXiv Detail & Related papers (2021-09-21T17:59:31Z)
- Query Embedding on Hyper-relational Knowledge Graphs [0.4779196219827507]
Multi-hop logical reasoning is an established problem in the field of representation learning on knowledge graphs.
We extend the multi-hop reasoning problem to hyper-relational KGs, allowing us to tackle this new type of complex query.
arXiv Detail & Related papers (2021-06-15T14:08:50Z)
- QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering [122.84513233992422]
We propose a new model, QA-GNN, which addresses the problem of answering questions using knowledge from pre-trained language models (LMs) and knowledge graphs (KGs).
We show its improvement over existing LM and LM+KG models, as well as its capability to perform interpretable and structured reasoning.
arXiv Detail & Related papers (2021-04-13T17:32:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.