Logical Message Passing Networks with One-hop Inference on Atomic
Formulas
- URL: http://arxiv.org/abs/2301.08859v4
- Date: Sat, 26 Aug 2023 07:45:41 GMT
- Title: Logical Message Passing Networks with One-hop Inference on Atomic
Formulas
- Authors: Zihao Wang, Yangqiu Song, Ginny Y. Wong, Simon See
- Abstract summary: We propose a framework for complex query answering that decouples Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
- Score: 57.47174363091452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Complex Query Answering (CQA) over Knowledge Graphs (KGs) has attracted
much attention for its potential to support many applications. Given that KGs are
usually incomplete, neural models have been proposed to answer logical queries by
parameterizing set operators with complex neural networks. However, such
methods usually train neural set operators together with a large number of entity and
relation embeddings from scratch, so whether and how the embeddings or the
neural set operators contribute to the performance remains unclear. In this
paper, we propose a simple framework for complex query answering that
decouples the KG embeddings from neural set operators. We propose to represent
complex queries as query graphs. On top of the query graph, we propose
the Logical Message Passing Neural Network (LMPNN) that connects the local
one-hop inferences on atomic formulas to the global logical reasoning for
complex query answering. We leverage existing effective KG embeddings to
conduct one-hop inferences on atomic formulas, the results of which are
regarded as the messages passed in LMPNN. The reasoning process over the
overall logical formulas is turned into the forward pass of LMPNN that
incrementally aggregates local information to finally predict the answers'
embeddings. The complex logical inference across different types of queries
will then be learned from training examples based on the LMPNN architecture.
Theoretically, our query-graph representation is more general than the
prevailing operator-tree formulation, so our approach applies to a broader
range of complex KG queries. Empirically, our approach yields the new
state-of-the-art neural CQA model. Our research bridges the gap between complex
KG query answering tasks and the long-standing achievements of knowledge graph
representation learning.
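The abstract's pipeline (query graph, one-hop inference with pretrained KG embeddings as messages, aggregation toward answer embeddings) can be illustrated with a toy sketch. This is not the paper's implementation: the TransE-style scoring, mean aggregation, random embeddings, and all names here are illustrative assumptions.

```python
import numpy as np

# Toy sketch of LMPNN-style inference: a conjunctive query is a graph whose
# nodes are terms (constants and variables) and whose edges are atomic
# formulas r(h, t). Each edge passes a message computed by one-hop inference
# with a pretrained KG embedding (a TransE-like model, h + r ~ t, chosen only
# for illustration). Variable embeddings are updated by aggregating incoming
# messages, loosely mirroring the LMPNN forward pass described above.

rng = np.random.default_rng(0)
DIM = 8

# Hypothetical "pretrained" embeddings (random here, for illustration only).
entity_emb = {"Turing": rng.normal(size=DIM)}
relation_emb = {"born_in": rng.normal(size=DIM),
                "citizen_of": rng.normal(size=DIM)}

def one_hop_message(src_vec, rel, direction):
    """One-hop inference on an atomic formula r(h, t) under a TransE-style
    model. direction=+1 predicts the tail from the head; -1 the reverse."""
    return src_vec + direction * relation_emb[rel]

def lmpnn_forward(edges, constants, variables, num_layers=2):
    """edges: list of (head_term, relation, tail_term) atomic formulas."""
    emb = {v: np.zeros(DIM) for v in variables}
    emb.update({c: entity_emb[c] for c in constants})
    for _ in range(num_layers):
        messages = {v: [] for v in variables}
        for h, r, t in edges:
            if t in messages:  # message toward the tail variable
                messages[t].append(one_hop_message(emb[h], r, +1))
            if h in messages:  # message toward the head variable
                messages[h].append(one_hop_message(emb[t], r, -1))
        for v, msgs in messages.items():
            if msgs:
                emb[v] = np.mean(msgs, axis=0)  # simple mean aggregation
    return emb

# Query: ?x such that born_in(Turing, ?x) AND citizen_of(Turing, ?x)
query_edges = [("Turing", "born_in", "?x"), ("Turing", "citizen_of", "?x")]
emb = lmpnn_forward(query_edges, constants=["Turing"], variables=["?x"])
print(emb["?x"].shape)  # (8,) -- the predicted answer embedding for ?x
```

The predicted embedding for `?x` would then be compared against all entity embeddings to rank candidate answers; the actual LMPNN learns the aggregation from training examples rather than using a fixed mean.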
Related papers
- Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KGs built upon large language models (LLMs).
We augment the arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs.
Experiments across widely used datasets demonstrate that LACT achieves substantial improvements (an average +5.5% MRR gain) over advanced methods.
arXiv Detail & Related papers (2024-05-02T18:12:08Z) - Meta Operator for Complex Query Answering on Knowledge Graphs [58.340159346749964]
We argue that different logical operator types, rather than the different complex query types, are the key to improving generalizability.
We propose a meta-learning algorithm to learn the meta-operators with limited data and adapt them to different instances of operators under various complex queries.
Empirical results show that learning meta-operators is more effective than learning original CQA or meta-CQA models.
arXiv Detail & Related papers (2024-03-15T08:54:25Z) - ReasoningLM: Enabling Structural Subgraph Reasoning in Pre-trained
Language Models for Question Answering over Knowledge Graph [142.42275983201978]
We propose a subgraph-aware self-attention mechanism to imitate the GNN for performing structured reasoning.
We also adopt an adaptation tuning strategy to adapt the model parameters with 20,000 subgraphs with synthesized questions.
Experiments show that ReasoningLM surpasses state-of-the-art models by a large margin, even with fewer updated parameters and less training data.
arXiv Detail & Related papers (2023-12-30T07:18:54Z) - Query Structure Modeling for Inductive Logical Reasoning Over Knowledge
Graphs [67.043747188954]
We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers.
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
arXiv Detail & Related papers (2023-05-23T01:25:29Z) - Inductive Logical Query Answering in Knowledge Graphs [30.220508024471595]
We study the inductive query answering task where inference is performed on a graph containing new entities with queries over both seen and unseen entities.
We devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs).
Experimentally, we show that inductive models are able to perform logical reasoning at inference time over unseen nodes generalizing to graphs up to 500% larger than training ones.
arXiv Detail & Related papers (2022-10-13T03:53:34Z) - Neural-Symbolic Entangled Framework for Complex Query Answering [22.663509971491138]
We propose a Neural-Symbolic Entangled framework (ENeSy) for complex query answering.
It enables the neural and symbolic reasoning to enhance each other to alleviate the cascading error and KG incompleteness.
ENeSy achieves SOTA performance on several benchmarks, especially when the model is trained only with the link prediction task.
arXiv Detail & Related papers (2022-09-19T06:07:10Z) - Neural-Symbolic Models for Logical Queries on Knowledge Graphs [17.290758383645567]
We propose Graph Neural Network Query Executor (GNN-QE), a neural-symbolic model that enjoys the advantages of both worlds.
GNN-QE decomposes a complex FOL query into relation projections and logical operations over fuzzy sets.
Experiments on 3 datasets show that GNN-QE significantly improves over previous state-of-the-art models in answering FOL queries.
arXiv Detail & Related papers (2022-05-16T18:39:04Z) - Knowledge Base Question Answering by Case-based Reasoning over Subgraphs [81.22050011503933]
We show that our model answers queries requiring complex reasoning patterns more effectively than existing KG completion algorithms.
The proposed model outperforms or performs competitively with state-of-the-art models on several KBQA benchmarks.
arXiv Detail & Related papers (2022-02-22T01:34:35Z) - Fuzzy Logic based Logical Query Answering on Knowledge Graph [37.039516386710716]
We present FuzzQE, a fuzzy logic based query embedding framework for answering FOL queries over KGs.
FuzzQE follows fuzzy logic to define logical operators in a principled and learning-free manner.
Experiments on two benchmark datasets demonstrate that FuzzQE achieves significantly better performance in answering FOL queries.
arXiv Detail & Related papers (2021-08-05T05:54:00Z) - Query Embedding on Hyper-relational Knowledge Graphs [0.4779196219827507]
Multi-hop logical reasoning is an established problem in the field of representation learning on knowledge graphs.
We extend the multi-hop reasoning problem to hyper-relational KGs, enabling this new type of complex query to be tackled.
arXiv Detail & Related papers (2021-06-15T14:08:50Z)
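Several papers above (FuzzQE, GNN-QE) define logical operators over fuzzy sets, where each query variable is a vector of membership scores in [0, 1] over all entities. A minimal sketch of such learning-free operators follows; the product t-norm and its dual are one common choice, used here for illustration and not necessarily the exact formulation of either paper.

```python
import numpy as np

# Fuzzy-set logical operators over membership-score vectors in [0, 1].
# With the product t-norm, conjunction, disjunction, and negation are
# closed-form and have no learnable parameters ("learning-free").

def fuzzy_and(a, b):
    return a * b                 # product t-norm

def fuzzy_or(a, b):
    return a + b - a * b         # probabilistic sum (dual t-conorm)

def fuzzy_not(a):
    return 1.0 - a               # standard negation

# Membership scores of three entities in two fuzzy answer sets.
a = np.array([0.9, 0.2, 0.5])
b = np.array([0.8, 0.6, 0.0])
print(fuzzy_and(a, b))  # elementwise conjunction of the two sets
```

In a query-embedding pipeline, relation projections produce such membership vectors and the operators above combine them, so only the projection step needs training.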
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.