Query2Triple: Unified Query Encoding for Answering Diverse Complex
Queries over Knowledge Graphs
- URL: http://arxiv.org/abs/2310.11246v1
- Date: Tue, 17 Oct 2023 13:13:30 GMT
- Title: Query2Triple: Unified Query Encoding for Answering Diverse Complex
Queries over Knowledge Graphs
- Authors: Yao Xu, Shizhu He, Cunguang Wang, Li Cai, Kang Liu, Jun Zhao
- Abstract summary: We propose Query to Triple (Q2T), a novel approach that decouples the training for simple and complex queries.
Our proposed Q2T is not only efficient to train, but also modular, thus easily adaptable to various neural link predictors.
- Score: 29.863085746761556
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Complex Query Answering (CQA) is a challenging task on Knowledge Graphs (KGs).
Due to the incompleteness of KGs, query embedding (QE) methods have been
proposed to encode queries and entities into the same embedding space, and
treat logical operators as neural set operators to obtain answers. However,
these methods train KG embeddings and neural set operators concurrently on both
simple (one-hop) and complex (multi-hop and logical) queries, which causes
performance degradation on simple queries and low training efficiency. In this
paper, we propose Query to Triple (Q2T), a novel approach that decouples the
training for simple and complex queries. Q2T divides the training into two
stages: (1) Pre-training a neural link predictor on simple queries to predict
tail entities based on the head entity and relation. (2) Training a query
encoder on complex queries to encode diverse complex queries into a unified
triple form that can be efficiently solved by the pretrained neural link
predictor. Our proposed Q2T is not only efficient to train, but also modular,
thus easily adaptable to various well-studied neural link predictors. Extensive
experiments demonstrate that, even without explicit modeling
for neural set operators, Q2T still achieves state-of-the-art performance on
diverse complex queries over three public benchmarks.
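The two-stage recipe in the abstract can be sketched as follows. This is a minimal illustration, not the paper's architecture: the DistMult-style scorer, the toy dimensions, and the `encode_query` composition are all assumptions standing in for the pretrained neural link predictor and the trainable query encoder.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_entities, n_relations = 8, 5, 3

# Stage 1 (pre-training): a neural link predictor learned on simple
# one-hop queries. A DistMult-style scorer stands in for any pretrained
# predictor; its parameters are frozen after this stage.
E = rng.normal(size=(n_entities, dim))   # entity embeddings
R = rng.normal(size=(n_relations, dim))  # relation embeddings

def link_predictor(h_vec, r_vec):
    """Score every entity as a tail for the given (head, relation) pair."""
    return E @ (h_vec * r_vec)

# Stage 2 (query encoding): a trainable encoder maps a complex query,
# e.g. the 2-hop chain (e0, r0, ?x) AND (?x, r1, ?y), to a single
# *virtual* (head, relation) pair, so the frozen predictor can answer it
# as if it were one triple. W_h and W_r are hypothetical encoder weights.
W_h = rng.normal(size=(2 * dim, dim))
W_r = rng.normal(size=(2 * dim, dim))

def encode_query(head_id, rel_ids):
    feats = np.concatenate([E[head_id], R[rel_ids].sum(axis=0)])
    return feats @ W_h, feats @ W_r   # virtual head, virtual relation

vh, vr = encode_query(0, [0, 1])
scores = link_predictor(vh, vr)       # one score per candidate answer
print(scores.shape)  # (5,)
```

Because only the query encoder is trained in stage 2, the same encoder can in principle be plugged onto different pretrained link predictors, which is the modularity the abstract claims.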
Related papers
- Effective Instruction Parsing Plugin for Complex Logical Query Answering on Knowledge Graphs [51.33342412699939]
Knowledge Graph Query Embedding (KGQE) aims to embed First-Order Logic (FOL) queries in a low-dimensional KG space for complex reasoning over incomplete KGs.
Recent studies integrate various external information (such as entity types and relation context) to better capture the logical semantics of FOL queries.
We propose an effective Query Instruction Parsing (QIPP) that captures latent query patterns from code-like query instructions.
arXiv Detail & Related papers (2024-10-27T03:18:52Z)
- Adaptive-RAG: Learning to Adapt Retrieval-Augmented Large Language Models through Question Complexity [59.57065228857247]
Retrieval-augmented Large Language Models (LLMs) have emerged as a promising approach to enhancing response accuracy in several tasks, such as Question-Answering (QA).
We propose a novel adaptive QA framework that can dynamically select the most suitable strategy for (retrieval-augmented) LLMs based on query complexity.
We validate our model on a set of open-domain QA datasets, covering multiple query complexities, and show that ours enhances the overall efficiency and accuracy of QA systems.
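Complexity-based routing of this kind can be sketched in a few lines. The keyword heuristic below is a placeholder for the trained complexity classifier the paper describes, and the three strategies are stubs, not real retrieval pipelines.

```python
# Minimal sketch of routing between answering strategies by query
# complexity, in the spirit of Adaptive-RAG. The keyword heuristic
# stands in for a learned classifier; the strategies are stubs.

def classify_complexity(question: str) -> str:
    """Hypothetical stand-in for a learned query-complexity classifier."""
    multi_hop_cues = ("and then", "compare", "both", "after")
    q = question.lower()
    if any(cue in q for cue in multi_hop_cues):
        return "complex"
    if "who" in q or "when" in q:
        return "simple"
    return "trivial"

def answer(question: str) -> str:
    strategy = {
        "trivial": lambda q: f"[LLM only] {q}",             # no retrieval
        "simple":  lambda q: f"[single retrieval] {q}",     # one-step RAG
        "complex": lambda q: f"[iterative retrieval] {q}",  # multi-step RAG
    }[classify_complexity(question)]
    return strategy(question)

print(answer("Who wrote Dune?"))
print(answer("Compare the birthplaces of both authors."))
```

The efficiency gain comes from reserving the expensive multi-step strategy for queries the classifier deems complex, while trivial queries skip retrieval entirely.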
arXiv Detail & Related papers (2024-03-21T13:52:30Z)
- Meta Operator for Complex Query Answering on Knowledge Graphs [58.340159346749964]
We argue that different logical operator types, rather than the different complex query types, are the key to improving generalizability.
We propose a meta-learning algorithm to learn the meta-operators with limited data and adapt them to different instances of operators under various complex queries.
Empirical results show that learning meta-operators is more effective than learning original CQA or meta-CQA models.
arXiv Detail & Related papers (2024-03-15T08:54:25Z)
- Type-based Neural Link Prediction Adapter for Complex Query Answering [2.1098688291287475]
We propose TypE-based Neural Link Prediction Adapter (TENLPA), a novel model that constructs type-based entity-relation graphs.
In order to effectively combine type information with complex logical queries, an adaptive learning mechanism is introduced.
Experiments on 3 standard datasets show that the TENLPA model achieves state-of-the-art performance on complex query answering.
arXiv Detail & Related papers (2024-01-29T10:54:28Z)
- Sequential Query Encoding For Complex Query Answering on Knowledge Graphs [31.40820604209387]
We propose sequential query encoding (SQE) as an alternative to encode queries for knowledge graph (KG) reasoning.
SQE first uses a search-based algorithm to linearize the computational graph to a sequence of tokens and then uses a sequence encoder to compute its vector representation.
Despite its simplicity, SQE demonstrates state-of-the-art neural query encoding performance on FB15k, FB15k-237, and NELL.
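The linearize-then-encode idea can be illustrated with a toy computation graph. The nested-tuple query format, the prefix-order traversal, and the averaging "encoder" below are simplifications: SQE uses a search-based linearization algorithm and a neural sequence encoder.

```python
import numpy as np

# (op, args...) trees: a 2-hop projection from entity e1 via r1 then r2,
# intersected ("i") with a projection ("p") from e2 via r3.
query = ("i", ("p", "r2", ("p", "r1", "e1")), ("p", "r3", "e2"))

def linearize(node):
    """Prefix-order traversal of the computation graph into tokens."""
    if isinstance(node, str):
        return [node]
    op, *args = node
    return ["(", op, *[t for a in args for t in linearize(a)], ")"]

tokens = linearize(query)
print(tokens)

# Stand-in sequence encoder: embed tokens and average them. SQE would
# run a Transformer/LSTM over the token sequence instead.
vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
emb = np.random.default_rng(0).normal(size=(len(vocab), 4))
query_vec = emb[[vocab[t] for t in tokens]].mean(axis=0)
print(query_vec.shape)  # (4,)
```

Once every query type is a flat token sequence, one encoder handles all query structures without per-operator modules, which is what makes the approach simple.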
arXiv Detail & Related papers (2023-02-25T16:33:53Z)
- Adapting Neural Link Predictors for Data-Efficient Complex Query Answering [45.961111441411084]
We propose a parameter-efficient score adaptation model optimised to re-calibrate neural link prediction scores for the complex query answering task.
CQD$^{\mathcal{A}}$ produces significantly more accurate results than current state-of-the-art methods.
arXiv Detail & Related papers (2023-01-29T00:17:16Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decomposes the Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graphs (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- NQE: N-ary Query Embedding for Complex Query Answering over Hyper-Relational Knowledge Graphs [1.415350927301928]
Complex query answering is an essential task for logical reasoning on knowledge graphs.
We propose a novel N-ary Query Embedding (NQE) model for CQA over hyper-relational knowledge graphs (HKGs).
NQE utilizes a dual-heterogeneous Transformer encoder and fuzzy logic theory to satisfy all n-ary FOL queries.
We generate a new CQA dataset WD50K-NFOL, including diverse n-ary FOL queries over WD50K.
arXiv Detail & Related papers (2022-11-24T08:26:18Z)
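Fuzzy-logic treatment of FOL operators, of the kind NQE builds on, can be shown with per-entity membership scores in [0, 1]. The product t-norm and probabilistic sum below are one standard choice; NQE's exact operators may differ.

```python
import numpy as np

# Fuzzy answer sets: each entity gets a membership score in [0, 1].
a = np.array([0.9, 0.2, 0.7])  # fuzzy answer set of sub-query A
b = np.array([0.8, 0.6, 0.1])  # fuzzy answer set of sub-query B

conj = a * b                   # AND: product t-norm
disj = a + b - a * b           # OR: probabilistic sum (t-conorm)
neg = 1.0 - a                  # NOT: standard complement

print(conj.round(2))  # [0.72 0.12 0.07]
print(disj.round(2))  # [0.98 0.68 0.73]
```

Because every operator is a differentiable arithmetic expression, the whole FOL query can be trained end-to-end with the encoder.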
This list is automatically generated from the titles and abstracts of the papers in this site.