Fuzzy Logic based Logical Query Answering on Knowledge Graph
- URL: http://arxiv.org/abs/2108.02390v1
- Date: Thu, 5 Aug 2021 05:54:00 GMT
- Title: Fuzzy Logic based Logical Query Answering on Knowledge Graph
- Authors: Xuelu Chen, Ziniu Hu, Yizhou Sun
- Abstract summary: We present FuzzQE, a fuzzy logic based query embedding framework for answering FOL queries over KGs.
FuzzQE follows fuzzy logic to define logical operators in a principled and learning-free manner.
Experiments on two benchmark datasets demonstrate that FuzzQE achieves significantly better performance in answering FOL queries.
- Score: 37.039516386710716
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Answering complex First-Order Logical (FOL) queries on large-scale incomplete
knowledge graphs (KGs) is an important yet challenging task. Recent advances
embed logical queries and KG entities in the vector space and conduct query
answering via dense similarity search. However, most of the designed logical
operators in existing works do not satisfy the axiomatic system of classical
logic. Moreover, these logical operators are parameterized, so they require
a large number of complex FOL queries as training data, which are often arduous
or even impossible to collect in most real-world KGs. In this paper, we
present FuzzQE, a fuzzy logic based query embedding framework for answering FOL
queries over KGs. FuzzQE follows fuzzy logic to define logical operators in a
principled and learning-free manner. Extensive experiments on two benchmark
datasets demonstrate that FuzzQE achieves significantly better performance in
answering FOL queries compared to the state-of-the-art methods. In addition,
FuzzQE trained with only KG link prediction and no complex queries can
achieve performance comparable to systems trained with all FOL queries.
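For intuition, the fuzzy logic operators that FuzzQE builds on can be written down with no learned parameters. Below is a minimal sketch (not the authors' code) using the product t-norm family on entity membership scores in [0, 1]; FuzzQE's exact choice of t-norm and embedding space may differ.

```python
import numpy as np

# Illustrative sketch: fuzzy logic defines logical operators on membership
# scores in [0, 1] without any trainable parameters. The product t-norm
# family is one common choice; the paper's choice may differ.

def fuzzy_and(x, y):
    """Conjunction via the product t-norm: T(a, b) = a * b."""
    return x * y

def fuzzy_or(x, y):
    """Disjunction via the probabilistic sum (dual t-conorm): S(a, b) = a + b - a*b."""
    return x + y - x * y

def fuzzy_not(x):
    """Negation as the standard complement: N(a) = 1 - a."""
    return 1.0 - x

# Toy example: membership scores of four candidate entities for two sub-queries.
q1 = np.array([0.9, 0.2, 0.7, 0.1])   # e.g. "entities related to A"
q2 = np.array([0.8, 0.6, 0.1, 0.4])   # e.g. "entities related to B"

print(fuzzy_and(q1, q2))              # A AND B
print(fuzzy_or(q1, q2))               # A OR B
print(fuzzy_and(q1, fuzzy_not(q2)))   # A AND NOT B
```

Because these operators satisfy the expected logical identities element-wise, no complex-query supervision is needed to fit them, which is consistent with the link-prediction-only training result reported in the abstract.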
Related papers
- Effective Instruction Parsing Plugin for Complex Logical Query Answering on Knowledge Graphs [51.33342412699939]
Knowledge Graph Query Embedding (KGQE) aims to embed First-Order Logic (FOL) queries in a low-dimensional KG space for complex reasoning over incomplete KGs.
Recent studies integrate various external information (such as entity types and relation context) to better capture the logical semantics of FOL queries.
We propose an effective Query Instruction Parsing Plugin (QIPP) that captures latent query patterns from code-like query instructions.
arXiv Detail & Related papers (2024-10-27T03:18:52Z)
- Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KGs built upon large language models (LLMs).
We augment arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs (see the sketch after this entry).
Experiments across widely used datasets demonstrate that LACT yields substantial improvements (an average gain of +5.5% MRR) over advanced methods.
arXiv Detail & Related papers (2024-05-02T18:12:08Z)
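To make the binary tree decomposition mentioned in the entry above concrete, here is a hypothetical sketch; the Node type, the decompose function, and the example query are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass
from typing import Union

# Hypothetical sketch of binary tree decomposition of a first-order logical
# query; none of these names come from the paper itself.

@dataclass
class Node:
    op: str                              # "AND", "OR", "NOT", or "ATOM"
    left: Union["Node", None] = None
    right: Union["Node", None] = None
    atom: Union[str, None] = None

def decompose(query) -> Node:
    """Recursively turn a nested (op, args...) tuple into a binary operator tree.
    n-ary AND/OR is left-folded into a chain of binary nodes."""
    if isinstance(query, str):           # an atomic formula, e.g. "Win(TuringAward, x)"
        return Node(op="ATOM", atom=query)
    op, *args = query
    if op == "NOT":
        return Node(op="NOT", left=decompose(args[0]))
    tree = decompose(args[0])
    for arg in args[1:]:
        tree = Node(op=op, left=tree, right=decompose(arg))
    return tree

# Example: entities that won the Turing Award and were born in Canada or the UK.
q = ("AND", "Win(TuringAward, x)", ("OR", "BornIn(Canada, x)", "BornIn(UK, x)"))
print(decompose(q))
```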
- Meta Operator for Complex Query Answering on Knowledge Graphs [58.340159346749964]
We argue that different logical operator types, rather than the different complex query types, are the key to improving generalizability.
We propose a meta-learning algorithm to learn the meta-operators with limited data and adapt them to different instances of operators under various complex queries.
Empirical results show that learning meta-operators is more effective than learning original CQA or meta-CQA models.
arXiv Detail & Related papers (2024-03-15T08:54:25Z)
- Query Structure Modeling for Inductive Logical Reasoning Over Knowledge Graphs [67.043747188954]
We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers (a sketch of one possible linearization follows this entry).
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
arXiv Detail & Related papers (2023-05-23T01:25:29Z)
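As a rough illustration of the structure-modeled textual encoding described in the entry above, the sketch below linearizes a nested query into a bracketed string that a pre-trained language model could encode; the bracketing scheme, relation names, and model choice are assumptions, not the paper's.

```python
# Hypothetical linearization of a query structure into a token sequence for a
# pre-trained language model; illustrative only.

def linearize(node) -> str:
    """Flatten a nested (operator, args...) query into a bracketed string."""
    if isinstance(node, str):
        return node
    op, *args = node
    return "[" + op + " " + " ".join(linearize(a) for a in args) + "]"

query = ("AND",
         ("PROJECT", "locatedIn", "Europe"),
         ("NOT", ("PROJECT", "memberOf", "EU")))
text = linearize(query)
print(text)   # [AND [PROJECT locatedIn Europe] [NOT [PROJECT memberOf EU]]]

# The string can then be fed to any pre-trained encoder, e.g. with the
# Hugging Face transformers library (if installed):
# from transformers import AutoTokenizer, AutoModel
# tok = AutoTokenizer.from_pretrained("bert-base-uncased")
# model = AutoModel.from_pretrained("bert-base-uncased")
# emb = model(**tok(text, return_tensors="pt")).last_hidden_state
```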
- LogicRec: Recommendation with Users' Logical Requirements [29.892669049903034]
We formulate the problem of recommendation with users' logical requirements (LogicRec).
We propose an initial solution for LogicRec based on logical requirement retrieval and user preference retrieval.
arXiv Detail & Related papers (2023-04-23T18:46:58Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples the Knowledge Graph embeddings from the neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- Neural Methods for Logical Reasoning Over Knowledge Graphs [14.941769519278745]
We focus on answering multi-hop logical queries on Knowledge Graphs (KGs).
Most previous works have been unable to create models that accept full First-Order Logical (FOL) queries.
We introduce a set of models that use Neural Networks to create one-point vector embeddings to answer the queries.
arXiv Detail & Related papers (2022-09-28T23:10:09Z)
- Neural-Symbolic Entangled Framework for Complex Query Answering [22.663509971491138]
We propose a Neural-Symbolic Entangled framework (ENeSy) for complex query answering.
It enables neural and symbolic reasoning to enhance each other, alleviating cascading errors and KG incompleteness.
ENeSy achieves SOTA performance on several benchmarks, especially when the model is trained only with the link prediction task.
arXiv Detail & Related papers (2022-09-19T06:07:10Z)