Beta Embeddings for Multi-Hop Logical Reasoning in Knowledge Graphs
- URL: http://arxiv.org/abs/2010.11465v1
- Date: Thu, 22 Oct 2020 06:11:39 GMT
- Title: Beta Embeddings for Multi-Hop Logical Reasoning in Knowledge Graphs
- Authors: Hongyu Ren, Jure Leskovec
- Abstract summary: We present BetaE, a probabilistic embedding framework for answering arbitrary FOL queries over a knowledge graph (KG).
BetaE is the first method that can handle a complete set of first-order logical operations.
We demonstrate the performance of BetaE on answering arbitrary FOL queries on three large, incomplete KGs.
- Score: 89.51365993393787
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One of the fundamental problems in Artificial Intelligence is to perform
complex multi-hop logical reasoning over the facts captured by a knowledge
graph (KG). This problem is challenging, because KGs can be massive and
incomplete. Recent approaches embed KG entities in a low-dimensional space and
then use these embeddings to find the answer entities. However, handling
arbitrary first-order logic (FOL) queries has remained an outstanding
challenge, as present methods are limited to only a subset of FOL operators; in
particular, the negation operator is not supported. A further limitation of
present methods is that they cannot naturally model uncertainty. Here, we
present BetaE, a probabilistic embedding framework for answering arbitrary FOL
queries over KGs. BetaE is the first method that can handle a complete set of
first-order logical operations: conjunction ($\wedge$), disjunction ($\vee$),
and negation ($\neg$). A key insight of BetaE is to use probabilistic
distributions with bounded support, specifically the Beta distribution, and
embed queries/entities as distributions, which as a consequence allows us to
also faithfully model uncertainty. Logical operations are performed in the
embedding space by neural operators over the probabilistic embeddings. We
demonstrate the performance of BetaE on answering arbitrary FOL queries on
three large, incomplete KGs. While being more general, BetaE also increases
relative performance by up to 25.4% over the current state-of-the-art KG
reasoning methods that can only handle conjunctive queries without negation.
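To make the mechanics concrete, here is a minimal, illustrative sketch (not the authors' released code): each entity and query is a vector of Beta(α, β) parameters, negation takes the reciprocal of both parameters, intersection interpolates parameters (the paper uses learned attention weights; uniform weights are assumed here for simplicity), and entities are ranked by the KL divergence to the query embedding.

```python
# Minimal BetaE-style sketch (illustrative only; the paper learns relation
# projections with neural networks and uses attention for intersection).
import numpy as np
from scipy.special import betaln, digamma

def kl_beta(p, q):
    """KL( Beta(a1,b1) || Beta(a2,b2) ), summed over embedding dimensions."""
    a1, b1 = p
    a2, b2 = q
    return np.sum(
        betaln(a2, b2) - betaln(a1, b1)
        + (a1 - a2) * digamma(a1)
        + (b1 - b2) * digamma(b1)
        + (a2 - a1 + b2 - b1) * digamma(a1 + b1)
    )

def negation(q):
    """BetaE negation: take the reciprocal of both Beta parameters."""
    a, b = q
    return (1.0 / a, 1.0 / b)

def intersection(queries):
    """Uniform-weight interpolation of parameters (paper: attention weights)."""
    a = np.mean([q[0] for q in queries], axis=0)
    b = np.mean([q[1] for q in queries], axis=0)
    return (a, b)

def random_embedding(dim, rng):
    # Parameters must stay positive; illustrative random initialization.
    return (rng.uniform(0.5, 2.0, dim), rng.uniform(0.5, 2.0, dim))

rng = np.random.default_rng(0)
entities = {name: random_embedding(4, rng) for name in ["Canada", "Turing", "Hinton"]}
q1, q2 = random_embedding(4, rng), random_embedding(4, rng)

query = intersection([q1, negation(q2)])          # q1 AND NOT q2
scores = {e: kl_beta(emb, query) for e, emb in entities.items()}
print(min(scores, key=scores.get))                # entity closest to the query
```

The reciprocal turns a peaked density into a dispersed one, which is how negating a query with few, certain answers yields a broad, uncertain answer set.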
Related papers
- Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KGs built on large language models (LLMs).
We augment arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs.
Experiments across widely used datasets demonstrate that LACT brings substantial improvements (an average +5.5% MRR gain) over advanced methods.
arXiv Detail & Related papers (2024-05-02T18:12:08Z)
- LINC: A Neurosymbolic Approach for Logical Reasoning by Combining Language Models with First-Order Logic Provers [60.009969929857704]
Logical reasoning is an important task for artificial intelligence with potential impacts on science, mathematics, and society.
In this work, we reformulate such tasks as modular neurosymbolic programming, which we call LINC.
We observe significant performance gains on FOLIO and a balanced subset of ProofWriter for three different models in nearly all experimental conditions we evaluate.
arXiv Detail & Related papers (2023-10-23T17:58:40Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- GammaE: Gamma Embeddings for Logical Queries on Knowledge Graphs [8.880867550516472]
We propose a novel probabilistic embedding model, namely Gamma Embeddings (GammaE), for encoding entities and queries.
We utilize the linear property and strong boundary support of the Gamma distribution to capture more features of entities and queries.
The performance of GammaE is validated on three large logical query datasets.
arXiv Detail & Related papers (2022-10-27T16:15:11Z)
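As a hedged illustration of why distributional embeddings such as GammaE are convenient: multiplying Gamma densities and renormalizing stays in the Gamma family, giving a closed-form intersection. Whether GammaE defines its intersection operator exactly this way is an assumption of this sketch.

```python
# Closed-form intersection of Gamma-distributed embeddings (sketch).
# Multiplying two Gamma densities and renormalizing stays in the Gamma family:
#   Gamma(k1, rate1) * Gamma(k2, rate2) -> Gamma(k1 + k2 - 1, rate1 + rate2)
import numpy as np

def gamma_intersection(shapes_rates):
    """Intersect per-dimension Gamma embeddings given as (shape, rate) pairs."""
    shapes = np.array([s for s, _ in shapes_rates])
    rates = np.array([r for _, r in shapes_rates])
    return shapes.sum(axis=0) - (len(shapes_rates) - 1), rates.sum(axis=0)

q1 = (np.array([2.0, 3.0]), np.array([1.0, 0.5]))   # (shape, rate) per dimension
q2 = (np.array([4.0, 1.5]), np.array([2.0, 1.0]))
shape, rate = gamma_intersection([q1, q2])
print(shape, rate)   # [5.  3.5] [3.  1.5]
```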
- Neural Methods for Logical Reasoning Over Knowledge Graphs [14.941769519278745]
We focus on answering multi-hop logical queries on Knowledge Graphs (KGs).
Most previous works have been unable to create models that accept full First-Order Logical (FOL) queries.
We introduce a set of models that use Neural Networks to create one-point vector embeddings to answer the queries.
arXiv Detail & Related papers (2022-09-28T23:10:09Z)
- Neural-Symbolic Entangled Framework for Complex Query Answering [22.663509971491138]
We propose a Neural-Symbolic Entangled framework (ENeSy) for complex query answering.
It enables neural and symbolic reasoning to enhance each other, alleviating cascading errors and KG incompleteness.
ENeSy achieves SOTA performance on several benchmarks, especially in the setting where the model is trained only with the link prediction task.
arXiv Detail & Related papers (2022-09-19T06:07:10Z)
- Probabilistic Entity Representation Model for Chain Reasoning over Knowledge Graphs [18.92547855877845]
We propose a Probabilistic Entity Representation Model (PERM) for logical reasoning over Knowledge Graphs.
PERM encodes each entity as a multivariate Gaussian density, using the mean and covariance parameters to capture semantic position and a smooth decision boundary.
We demonstrate PERM's competence on a COVID-19 drug-repurposing case study and show that our proposed work is able to recommend drugs with substantially better F1 than current methods.
arXiv Detail & Related papers (2021-10-26T09:26:10Z)
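A minimal sketch of the Gaussian-embedding idea, assuming a diagonal covariance and a simple log-density score; PERM's actual model composes Gaussians along relation chains and learns its parameters, and the entity names below are purely illustrative.

```python
# Illustrative Gaussian entity/query embeddings (not PERM's exact parameterization).
import numpy as np

class GaussianEmbedding:
    """Entity or query as a diagonal-covariance multivariate Gaussian."""
    def __init__(self, mean, var):
        self.mean = np.asarray(mean, dtype=float)   # semantic position
        self.var = np.asarray(var, dtype=float)     # smooth decision boundary

    def log_density(self, x):
        """Log N(x; mean, diag(var)), used here as a soft membership score."""
        x = np.asarray(x, dtype=float)
        return -0.5 * np.sum(
            np.log(2 * np.pi * self.var) + (x - self.mean) ** 2 / self.var
        )

query = GaussianEmbedding(mean=[0.2, -1.0], var=[0.5, 0.8])
candidates = {"drug_A": [0.1, -0.9], "drug_B": [2.0, 1.5]}   # hypothetical names
ranked = sorted(candidates, key=lambda c: -query.log_density(candidates[c]))
print(ranked)   # drug_A ranks first: closer to the query's mean
```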
- Fuzzy Logic based Logical Query Answering on Knowledge Graph [37.039516386710716]
We present FuzzQE, a fuzzy logic based query embedding framework for answering FOL queries over KGs.
FuzzQE follows fuzzy logic to define logical operators in a principled and learning-free manner.
Experiments on two benchmark datasets demonstrate that FuzzQE achieves significantly better performance in answering FOL queries.
arXiv Detail & Related papers (2021-08-05T05:54:00Z)
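For intuition, the sketch below shows one standard, learning-free choice of fuzzy operators (product logic) acting on membership scores in [0, 1]; FuzzQE's precise operator definitions, and how the scores are produced from embeddings, differ in detail.

```python
# Learning-free fuzzy logical operators on membership scores in [0, 1]
# (product logic shown; FuzzQE's exact operator choices may differ).
import numpy as np

def conj(x, y):          # fuzzy AND: product t-norm
    return x * y

def disj(x, y):          # fuzzy OR: probabilistic-sum t-conorm
    return x + y - x * y

def neg(x):              # fuzzy NOT
    return 1.0 - x

# Hypothetical membership scores of three candidate entities for two sub-queries.
friends_of_a = np.array([0.9, 0.4, 0.1])
lives_in_b   = np.array([0.8, 0.2, 0.7])

# "friend of A and does not live in B"
scores = conj(friends_of_a, neg(lives_in_b))
print(scores.argmax())   # index of the best-scoring candidate
```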
- Logic Embeddings for Complex Query Answering [56.25151854231117]
We propose Logic Embeddings, a new approach to embedding complex queries that uses Skolemisation to eliminate existential variables for efficient querying.
We show that Logic Embeddings are competitively fast and accurate in query answering over large, incomplete knowledge graphs, outperform on negation queries, and in particular, provide improved modeling of answer uncertainty.
arXiv Detail & Related papers (2021-02-28T07:52:37Z)
- Query2box: Reasoning over Knowledge Graphs in Vector Space using Box Embeddings [84.0206612938464]
query2box is an embedding-based framework for reasoning over arbitrary queries on incomplete knowledge graphs.
We show that query2box is capable of handling arbitrary logical queries with $\wedge$, $\vee$, $\exists$ in a scalable manner.
We demonstrate the effectiveness of query2box on three large KGs and show that query2box achieves up to 25% relative improvement over the state of the art.
arXiv Detail & Related papers (2020-02-14T11:20:10Z)
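A minimal sketch of the box-distance idea behind query2box, assuming a query box given by a center and offset and an entity given by a point; points outside the box are penalized more heavily than points inside (weight alpha).

```python
# Box-embedding distance in the spirit of query2box (simplified sketch).
import numpy as np

def box_distance(entity, center, offset, alpha=0.2):
    """Distance of an entity point to a query box with the given center/offset.

    dist_outside is zero when the point lies inside the box; dist_inside
    is down-weighted by alpha so inside points are preferred but still ranked.
    """
    q_min, q_max = center - offset, center + offset
    dist_outside = np.maximum(entity - q_max, 0) + np.maximum(q_min - entity, 0)
    dist_inside = center - np.minimum(q_max, np.maximum(q_min, entity))
    return np.sum(np.abs(dist_outside)) + alpha * np.sum(np.abs(dist_inside))

center, offset = np.array([0.0, 0.0]), np.array([1.0, 1.0])
inside, outside = np.array([0.5, -0.5]), np.array([2.0, 0.0])
print(box_distance(inside, center, offset) < box_distance(outside, center, offset))  # True
```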
This list is automatically generated from the titles and abstracts of the papers on this site.