Conditional Logical Message Passing Transformer for Complex Query Answering
- URL: http://arxiv.org/abs/2402.12954v2
- Date: Sat, 10 Aug 2024 10:15:47 GMT
- Title: Conditional Logical Message Passing Transformer for Complex Query Answering
- Authors: Chongzhi Zhang, Zhiping Peng, Junhao Zheng, Qianli Ma
- Abstract summary: We propose a new state-of-the-art neural CQA model, the Conditional Logical Message Passing Transformer (CLMPT).
We empirically verified that this approach can reduce computational costs without affecting performance.
Experimental results show that CLMPT is a new state-of-the-art neural CQA model.
- Score: 22.485655410582375
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Complex Query Answering (CQA) over Knowledge Graphs (KGs) is a challenging task. Given that KGs are usually incomplete, neural models are proposed to solve CQA by performing multi-hop logical reasoning. However, most of them cannot perform well on both one-hop and multi-hop queries simultaneously. Recent work proposes a logical message passing mechanism based on the pre-trained neural link predictors. While effective on both one-hop and multi-hop queries, it ignores the difference between the constant and variable nodes in a query graph. In addition, during the node embedding update stage, this mechanism cannot dynamically measure the importance of different messages, and whether it can capture the implicit logical dependencies related to a node and received messages remains unclear. In this paper, we propose Conditional Logical Message Passing Transformer (CLMPT), which considers the difference between constants and variables in the case of using pre-trained neural link predictors and performs message passing conditionally on the node type. We empirically verified that this approach can reduce computational costs without affecting performance. Furthermore, CLMPT uses the transformer to aggregate received messages and update the corresponding node embedding. Through the self-attention mechanism, CLMPT can assign adaptive weights to elements in an input set consisting of received messages and the corresponding node and explicitly model logical dependencies between various elements. Experimental results show that CLMPT is a new state-of-the-art neural CQA model. https://github.com/qianlima-lab/CLMPT.
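To make the abstract's two ideas concrete, here is a minimal PyTorch sketch of conditional node updating: constant nodes keep their pretrained embeddings, while a variable node is updated by self-attention over the set of its received messages plus itself. The dimensions, head count, residual/norm placement, and readout are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ConditionalNodeUpdate(nn.Module):
    """Sketch: update variable nodes with self-attention, skip constants."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, node_emb, messages, is_variable: bool):
        # node_emb: (dim,); messages: (num_msgs, dim)
        if not is_variable:
            # Constant nodes stay anchored to their entity embeddings, so the
            # costly transformer update is skipped (the cost saving the
            # abstract refers to).
            return node_emb
        # Input set = received messages plus the node itself.
        x = torch.cat([messages, node_emb.unsqueeze(0)]).unsqueeze(0)
        h, _ = self.attn(x, x, x)       # adaptive weights per element
        h = self.norm(h + x)            # residual + norm (assumed placement)
        return h[0, -1]                 # assumed readout: the node's own slot

dim = 32
update = ConditionalNodeUpdate(dim)
msgs = torch.randn(3, dim)              # messages from 3 incident atoms
new_emb = update(torch.randn(dim), msgs, is_variable=True)
print(new_emb.shape)                    # torch.Size([32])
```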
Related papers
- Towards Dynamic Message Passing on Graphs [104.06474765596687]
We propose a novel dynamic message-passing mechanism for graph neural networks (GNNs).
It projects graph nodes and learnable pseudo nodes into a common space with measurable spatial relations between them.
With nodes moving in the space, their evolving relations facilitate flexible pathway construction for a dynamic message-passing process.
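A rough illustrative sketch of this mechanism (our reading, not the paper's model): real nodes and learnable pseudo nodes are projected into a common space, and their pairwise distances define soft, dynamic pathways for gathering and scattering messages. All layer shapes and the softmax weighting are assumptions.

```python
import torch
import torch.nn as nn

class DynamicPseudoNodeMP(nn.Module):
    def __init__(self, dim: int, num_pseudo: int = 4):
        super().__init__()
        self.pseudo = nn.Parameter(torch.randn(num_pseudo, dim))
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim). Project nodes into the shared space.
        z = self.proj(x)
        # Measurable spatial relations: distances to the pseudo nodes.
        dist = torch.cdist(z, self.pseudo)       # (num_nodes, num_pseudo)
        # Closer pairs exchange more; the softmax over pseudo nodes acts as
        # the dynamic pathway weights.
        w = torch.softmax(-dist, dim=-1)
        pseudo_state = w.t() @ z                 # gather: nodes -> pseudo
        return x + w @ pseudo_state              # scatter: pseudo -> nodes

x = torch.randn(5, 16)                           # 5 graph nodes
print(DynamicPseudoNodeMP(16)(x).shape)          # torch.Size([5, 16])
```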
arXiv Detail & Related papers (2024-10-31T07:20:40Z)
- Effective Instruction Parsing Plugin for Complex Logical Query Answering on Knowledge Graphs [51.33342412699939]
Knowledge Graph Query Embedding (KGQE) aims to embed First-Order Logic (FOL) queries in a low-dimensional KG space for complex reasoning over incomplete KGs.
Recent studies integrate various external information (such as entity types and relation context) to better capture the logical semantics of FOL queries.
We propose an effective Query Instruction Parsing Plugin (QIPP) that captures latent query patterns from code-like query instructions.
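For illustration only, one plausible reading of "code-like query instructions" is serializing the computation steps of a FOL query as a small program for an encoder to parse; the instruction syntax below is our assumption, not QIPP's actual format.

```python
def to_instructions(anchor: str, relations: list[str]) -> list[str]:
    """Render a projection-chain FOL query as code-like instructions."""
    steps = [f"v0 = anchor({anchor!r})"]
    for i, rel in enumerate(relations):
        steps.append(f"v{i + 1} = project(v{i}, {rel!r})")
    steps.append(f"answer(v{len(relations)})")
    return steps

# 2-hop query: which entities are reachable via bornIn then locatedIn?
for line in to_instructions("AlanTuring", ["bornIn", "locatedIn"]):
    print(line)
```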
arXiv Detail & Related papers (2024-10-27T03:18:52Z)
- Pathformer: Recursive Path Query Encoding for Complex Logical Query Answering [20.521886749524814]
We propose a neural one-point embedding method called Pathformer based on the tree-like computation graph, i.e., query tree.
Specifically, Pathformer decomposes the query computation tree into path query sequences by branches.
This allows Pathformer to fully utilize future context information to explicitly model the complex interactions between various parts of a path query.
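A toy sketch of the decomposition step (the tree encoding is our assumption): each branch of the query computation tree yields one leaf-to-root path sequence, which a sequence encoder such as a transformer could then read with full bidirectional context.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                       # anchor entity, relation, or operator
    children: list["Node"] = field(default_factory=list)

def path_sequences(node: Node) -> list[list[str]]:
    """Decompose a query tree into leaf-to-root path sequences by branches."""
    if not node.children:
        return [[node.label]]
    paths = []
    for child in node.children:
        for p in path_sequences(child):
            paths.append(p + [node.label])
    return paths

# Intersection of two projection paths -- a "2i"-style query.
tree = Node("AND", [
    Node("citizenOf", [Node("France")]),
    Node("wonAward", [Node("NobelPrize")]),
])
print(path_sequences(tree))
# [['France', 'citizenOf', 'AND'], ['NobelPrize', 'wonAward', 'AND']]
```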
arXiv Detail & Related papers (2024-06-21T06:02:58Z)
- ReasoningLM: Enabling Structural Subgraph Reasoning in Pre-trained Language Models for Question Answering over Knowledge Graph [142.42275983201978]
We propose a subgraph-aware self-attention mechanism to imitate the GNN for performing structured reasoning.
We also adopt an adaptation tuning strategy to adapt the model parameters with 20,000 subgraphs paired with synthesized questions.
Experiments show that ReasoningLM surpasses state-of-the-art models by a large margin, even with fewer updated parameters and less training data.
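A minimal sketch of what a subgraph-aware self-attention mask could look like (our assumption, not the released ReasoningLM code): entity tokens attend only to their subgraph neighbours, so one attention layer behaves like one GNN message-passing step, while question tokens attend globally.

```python
import torch

def subgraph_attention_mask(num_q_tokens: int,
                            edges: list[tuple[int, int]],
                            num_entities: int) -> torch.Tensor:
    """True = attention allowed; usable as a transformer layer's mask."""
    n = num_q_tokens + num_entities
    mask = torch.zeros(n, n, dtype=torch.bool)
    # Question tokens attend to everything and are visible to everything.
    mask[:num_q_tokens, :] = True
    mask[:, :num_q_tokens] = True
    # Entity tokens attend to themselves and their subgraph neighbours only,
    # so one attention layer imitates one GNN message-passing step.
    idx = torch.arange(num_q_tokens, n)
    mask[idx, idx] = True
    for u, v in edges:
        mask[num_q_tokens + u, num_q_tokens + v] = True
        mask[num_q_tokens + v, num_q_tokens + u] = True
    return mask

# 2 question tokens, a 3-entity path subgraph 0 - 1 - 2.
print(subgraph_attention_mask(2, [(0, 1), (1, 2)], 3).int())
```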
arXiv Detail & Related papers (2023-12-30T07:18:54Z)
- Tractable Bounding of Counterfactual Queries by Knowledge Compilation [51.47174989680976]
We discuss the problem of bounding partially identifiable queries, such as counterfactuals, in Pearlian structural causal models.
A recently proposed iterated EM scheme yields an inner approximation of those bounds by sampling the initialisation parameters.
We show how a single symbolic knowledge compilation allows us to obtain the circuit structure with symbolic parameters to be replaced by their actual values.
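A conceptual toy of the iterated EM idea, heavily simplified from the paper: each run from a random initialisation converges to one latent parameterisation consistent with the observed data, the query is evaluated there, and the min/max over runs gives an inner approximation of the bounds. The model and query below are illustrative stand-ins, not the paper's setup.

```python
import random

def em_fit(data, alpha):
    """Hypothetical stand-in for one EM run over latent SCM parameters."""
    p_obs = sum(data) / len(data)          # observed P(Y = 1)
    # Any (a, b) with (a + b) / 2 = p_obs explains the data equally well,
    # so the data only partially identifies the mechanism.
    lo, hi = max(0.0, 2 * p_obs - 1.0), min(1.0, 2 * p_obs)
    a = lo + alpha * (hi - lo)             # "converged" point depends on init
    return a, 2 * p_obs - a

def counterfactual_query(a, b):
    return a * (1 - b)                     # some partially identified quantity

data = [1, 0, 1, 1, 0, 1]
values = [counterfactual_query(*em_fit(data, random.random()))
          for _ in range(200)]
print(f"inner approximation: [{min(values):.3f}, {max(values):.3f}]")
```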
arXiv Detail & Related papers (2023-10-05T07:10:40Z)
- Sequential Query Encoding For Complex Query Answering on Knowledge Graphs [31.40820604209387]
We propose sequential query encoding (SQE) as an alternative to encode queries for knowledge graph (KG) reasoning.
SQE first uses a search-based algorithm to linearize the computational graph to a sequence of tokens and then uses a sequence encoder to compute its vector representation.
Despite its simplicity, SQE demonstrates state-of-the-art neural query encoding performance on FB15k, FB15k-237, and NELL.
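A small sketch of the linearization step: a depth-first (prefix) traversal turns the query computation graph into a token sequence for a sequence encoder. SQE's actual search-based tokenization may differ; the traversal order and bracket tokens are our assumptions.

```python
def linearize(tree) -> list[str]:
    """tree: nested tuples like (op, child, ...) or a leaf entity string."""
    if isinstance(tree, str):
        return [tree]
    op, *children = tree
    tokens = ["(", op]
    for child in children:
        tokens += linearize(child)
    tokens.append(")")
    return tokens

# "ip"-style query: project the intersection of two projections.
query = ("p", "locatedIn", ("i", ("p", "bornIn", "Turing"),
                                 ("p", "livedIn", "Turing")))
print(" ".join(linearize(query)))
# ( p locatedIn ( i ( p bornIn Turing ) ( p livedIn Turing ) ) )
```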
arXiv Detail & Related papers (2023-02-25T16:33:53Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
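A hedged sketch of one-hop inference on an atomic formula r(h, t): a pretrained link predictor estimates each endpoint from the other, and those estimates become the messages passed over the query graph. The TransE-style predictor and the mean aggregation below are stand-ins of our choosing; LMPNN works with various pretrained predictors.

```python
import torch

def atom_messages(h_emb: torch.Tensor, t_emb: torch.Tensor,
                  r_emb: torch.Tensor):
    """One-hop inference on r(h, t): estimate each endpoint from the other."""
    msg_to_t = h_emb + r_emb          # forward: where should t be?
    msg_to_h = t_emb - r_emb          # backward: where should h be?
    return msg_to_h, msg_to_t

dim = 16
h, t, r = torch.randn(dim), torch.randn(dim), torch.randn(dim)
msg_to_h, msg_to_t = atom_messages(h, t, r)
# A global reasoning layer (an MLP/GNN in LMPNN) would aggregate all messages
# arriving at a node; mean aggregation is a simple stand-in here.
t_new = torch.stack([t, msg_to_t]).mean(dim=0)
print(t_new.shape)                    # torch.Size([16])
```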
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
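A schematic sketch of one way retrieval and reasoning can share a single question-relation matching module (our reading of the unification; the string matcher, scores, and threshold below are hypothetical stand-ins for UniKGQA's PLM-based semantic matching):

```python
def match(question: str, relation: str) -> float:
    # Hypothetical matcher; a real system would use a PLM similarity score.
    return 1.0 if relation in question else 0.1

def answer(question, topic, edges, hops=2, keep=0.5):
    # Retrieval: keep only edges whose relation matches the question well.
    subgraph = [(h, r, t) for h, r, t in edges if match(question, r) >= keep]
    # Reasoning: propagate scores from the topic entity along matched edges,
    # reusing the same matching function.
    frontier = {topic: 1.0}
    for _ in range(hops):
        nxt = {}
        for h, r, t in subgraph:
            if h in frontier:
                s = frontier[h] * match(question, r)
                nxt[t] = max(nxt.get(t, 0.0), s)
        frontier = nxt
    return max(frontier, key=frontier.get) if frontier else None

edges = [("Turing", "bornIn", "London"), ("London", "locatedIn", "England"),
         ("Turing", "field", "CS")]
print(answer("where is the place Turing was bornIn locatedIn?",
             "Turing", edges))        # England
```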
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Neural-Symbolic Entangled Framework for Complex Query Answering [22.663509971491138]
We propose a Neural-Symbolic Entangled framework (ENeSy) for complex query answering.
It enables neural and symbolic reasoning to enhance each other, alleviating cascading errors and KG incompleteness.
ENeSy achieves SOTA performance on several benchmarks, especially when the model is trained with only the link prediction task.
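A toy sketch of the entanglement idea (our simplification, not the ENeSy architecture): a symbolic step keeps a fuzzy vector over entities (exact but limited to known edges), a neural step scores all entities (complete but noisy), and each projection step mixes the two so errors do not cascade.

```python
import torch

num_entities = 6
# Symbolic one-hop lookup: adjacency of relation r (1 if the KG has the edge).
adj_r = torch.zeros(num_entities, num_entities)
adj_r[0, 1] = adj_r[1, 2] = 1.0              # known edges 0 -> 1 -> 2
# Stand-in for a neural link predictor's scores over all entities.
neural_scores = torch.tensor([0.0, 0.9, 0.2, 0.6, 0.0, 0.0])

def entangled_projection(fuzzy: torch.Tensor, mix: float = 0.5):
    symbolic = (fuzzy @ adj_r).clamp(max=1.0)   # exact traversal, misses links
    neural = fuzzy.max() * neural_scores        # crude stand-in, fills gaps
    return mix * symbolic + (1 - mix) * neural  # entangled ensemble

start = torch.zeros(num_entities)
start[0] = 1.0                                  # anchor entity 0
print(entangled_projection(start))
```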
arXiv Detail & Related papers (2022-09-19T06:07:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.