Probabilistic Entity Representation Model for Chain Reasoning over
Knowledge Graphs
- URL: http://arxiv.org/abs/2110.13522v1
- Date: Tue, 26 Oct 2021 09:26:10 GMT
- Title: Probabilistic Entity Representation Model for Chain Reasoning over
Knowledge Graphs
- Authors: Nurendra Choudhary, Nikhil Rao, Sumeet Katariya, Karthik Subbian,
Chandan K. Reddy
- Abstract summary: We propose a Probabilistic Entity Representation Model (PERM) for logical reasoning over Knowledge Graphs.
PERM encodes entities as Multivariate Gaussian densities whose mean and covariance parameters capture semantic position and a smooth decision boundary, respectively.
We demonstrate PERM's competence on a COVID-19 drug-repurposing case study and show that our proposed work is able to recommend drugs with substantially better F1 than current methods.
- Score: 18.92547855877845
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Logical reasoning over Knowledge Graphs (KGs) is a fundamental technique that
can provide an efficient querying mechanism over large and incomplete databases.
Current approaches employ spatial geometries such as boxes to learn query
representations that encompass the answer entities and model the logical
operations of projection and intersection. However, their geometry is
restrictive and leads to non-smooth strict boundaries, which further results in
ambiguous answer entities. Furthermore, previous works propose transformation
tricks to handle unions, which result in non-closure and thus cannot be
chained in a stream. In this paper, we propose a Probabilistic Entity
Representation Model (PERM) to encode entities as a Multivariate Gaussian
density with mean and covariance parameters to capture its semantic position
and smooth decision boundary, respectively. Additionally, we also define the
closed logical operations of projection, intersection, and union that can be
aggregated using an end-to-end objective function. On the logical query
reasoning problem, we demonstrate that the proposed PERM significantly
outperforms the state-of-the-art methods on various public benchmark KG
datasets on standard evaluation metrics. We also evaluate PERM's competence on
a COVID-19 drug-repurposing case study and show that our proposed work is able
to recommend drugs with substantially better F1 than current methods. Finally,
we demonstrate the working of our PERM's query answering process through a
low-dimensional visualization of the Gaussian representations.
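The abstract stresses that PERM's logical operators are closed, so query steps can be chained. A minimal sketch of why that holds for intersection (not the paper's exact parameterization): the renormalized product of two multivariate Gaussian densities is again Gaussian, with a precision-weighted mean and a tighter covariance.

```python
import numpy as np

def gaussian_intersection(mu1, cov1, mu2, cov2):
    """Intersect two Gaussian query densities via their renormalized product.

    The product of two multivariate Gaussian densities is proportional to
    another Gaussian, so the operation is closed -- the property the
    abstract highlights for chaining logical operators.
    """
    prec1, prec2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
    cov = np.linalg.inv(prec1 + prec2)          # combined covariance
    mu = cov @ (prec1 @ mu1 + prec2 @ mu2)      # precision-weighted mean
    return mu, cov

# Two 2-D "entity" densities: the intersection concentrates between them.
mu_a, cov_a = np.array([0.0, 0.0]), np.eye(2)
mu_b, cov_b = np.array([2.0, 0.0]), np.eye(2)
mu, cov = gaussian_intersection(mu_a, cov_a, mu_b, cov_b)
print(mu)   # midpoint [1., 0.] for equal covariances
print(cov)  # tighter than either input: 0.5 * I
```

Because the result is itself a Gaussian, it can be fed directly into the next projection or intersection step, which is exactly the chaining property the box-based geometries lack.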
Related papers
- Posets and Bounded Probabilities for Discovering Order-inducing Features in Event Knowledge Graphs [6.96958458974878]
Event knowledge graphs (EKG) capture multiple, interacting views of a process execution.
We tackle the open problem of EKG discovery from uncurated data.
We derive an EKG discovery algorithm based on statistical inference.
arXiv Detail & Related papers (2024-10-08T14:12:51Z)
- Modeling Relational Patterns for Logical Query Answering over Knowledge Graphs [29.47155614953955]
We develop a novel query embedding method, RoConE, that defines query regions as geometric cones and models algebraic query operators as rotations in complex space.
Our experimental results on several benchmark datasets confirm the advantage of relational patterns for enhancing logical query answering task.
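A hedged illustration of the rotation component the summary mentions (RoConE's actual cone-region operators are richer than this): a relation can act on a complex entity embedding as an element-wise rotation by learned phase angles, which preserves magnitudes. The dimensions and random values below are hypothetical.

```python
import numpy as np

# A relation as a rotation in complex embedding space; only the rotation
# part of RoConE's operators is illustrated here.
rng = np.random.default_rng(0)
dim = 4
head = rng.normal(size=dim) + 1j * rng.normal(size=dim)   # complex entity embedding
theta = rng.uniform(-np.pi, np.pi, size=dim)              # relation phase angles
relation = np.exp(1j * theta)                             # unit-modulus rotation

tail = head * relation                                    # apply the relation
# Rotations preserve magnitude, so |tail| == |head| element-wise.
print(np.allclose(np.abs(tail), np.abs(head)))            # True
```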
arXiv Detail & Related papers (2023-03-21T13:59:15Z)
- Self-Supervised Learning via Maximum Entropy Coding [57.56570417545023]
We propose Maximum Entropy Coding (MEC) as a principled objective that explicitly optimizes the structure of the representation.
MEC learns a more generalizable representation than previous methods based on specific pretext tasks.
It achieves state-of-the-art performance consistently on various downstream tasks, including not only ImageNet linear probe, but also semi-supervised classification, object detection, instance segmentation, and object tracking.
arXiv Detail & Related papers (2022-10-20T17:58:30Z)
- Neural-Symbolic Entangled Framework for Complex Query Answering [22.663509971491138]
We propose a Neural and Entangled framework (ENeSy) for complex query answering.
It enables neural and symbolic reasoning to enhance each other, alleviating cascading errors and KG incompleteness.
ENeSy achieves the SOTA performance on several benchmarks, especially in the setting of the training model only with the link prediction task.
arXiv Detail & Related papers (2022-09-19T06:07:10Z)
- PIE: a Parameter and Inference Efficient Solution for Large Scale Knowledge Graph Embedding Reasoning [24.29409958504209]
We propose PIE, a parameter- and inference-efficient solution.
Inspired by tensor decomposition methods, we find that decomposing the entity embedding matrix into low-rank matrices can reduce the parameters by more than half.
To accelerate model inference, we propose a self-supervised auxiliary task, which can be seen as fine-grained entity typing.
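The parameter saving from the low-rank decomposition mentioned above is easy to sketch. The entity count, dimension, and rank below are hypothetical, not PIE's actual settings: factoring an n x d embedding matrix into n x r and r x d factors costs nr + rd parameters instead of nd.

```python
# Illustrative parameter count for decomposing an entity-embedding matrix
# E (n_entities x d) into low-rank factors A (n_entities x r) and B (r x d).
# The shapes and rank here are hypothetical examples.
n_entities, d, r = 100_000, 512, 64

full_params = n_entities * d                    # dense embedding matrix
low_rank_params = n_entities * r + r * d        # two low-rank factors
print(full_params, low_rank_params, low_rank_params / full_params)
```

For any rank r well below d, the dominant n_entities * r term shrinks the budget roughly by the factor r / d, consistent with the "more than half" reduction the summary reports.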
arXiv Detail & Related papers (2022-04-29T09:06:56Z)
- ConE: Cone Embeddings for Multi-Hop Reasoning over Knowledge Graphs [73.86041481470261]
Cone Embeddings (ConE) is the first geometry-based query embedding model that can handle conjunction, disjunction, and negation.
ConE significantly outperforms existing state-of-the-art methods on benchmark datasets.
arXiv Detail & Related papers (2021-10-26T14:04:02Z)
- Unsupervised Knowledge Graph Alignment by Probabilistic Reasoning and Semantic Embedding [22.123001954919893]
We propose an iterative framework named PRASE which is based on probabilistic reasoning and semantic embedding.
The PRASE framework is compatible with different embedding-based models, and our experiments on multiple datasets have demonstrated its state-of-the-art performance.
arXiv Detail & Related papers (2021-05-12T11:27:46Z)
- How Fine-Tuning Allows for Effective Meta-Learning [50.17896588738377]
We present a theoretical framework for analyzing representations derived from a MAML-like algorithm.
We provide risk bounds on the best predictor found by fine-tuning via gradient descent, demonstrating that the algorithm can provably leverage the shared structure.
This result underscores the benefit of fine-tuning-based methods, such as MAML, over methods with "frozen representation" objectives in few-shot learning.
arXiv Detail & Related papers (2021-05-05T17:56:00Z)
- Probabilistic Case-based Reasoning for Open-World Knowledge Graph Completion [59.549664231655726]
A case-based reasoning (CBR) system solves a new problem by retrieving 'cases' that are similar to the given problem.
In this paper, we demonstrate that such a system is achievable for reasoning in knowledge-bases (KBs).
Our approach predicts attributes for an entity by gathering reasoning paths from similar entities in the KB.
arXiv Detail & Related papers (2020-10-07T17:48:12Z)
- Closed-Form Factorization of Latent Semantics in GANs [65.42778970898534]
A rich set of interpretable dimensions has been shown to emerge in the latent space of the Generative Adversarial Networks (GANs) trained for synthesizing images.
In this work, we examine the internal representation learned by GANs to reveal the underlying variation factors in an unsupervised manner.
We propose a closed-form factorization algorithm for latent semantic discovery by directly decomposing the pre-trained weights.
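A hedged sketch of the closed-form idea described above (a SeFa-style factorization, with a random matrix standing in for the pretrained weight): interpretable latent directions can be read off as eigenvectors of A^T A, where A is the first layer's latent-to-feature projection, with no training or sampling required.

```python
import numpy as np

# Interpretable latent directions as eigenvectors of A^T A, where A is the
# (pretrained) weight mapping latent codes to features. A is random here
# purely for illustration; shapes are hypothetical.
rng = np.random.default_rng(1)
A = rng.normal(size=(256, 64))                 # stand-in for a pretrained weight

eigvals, eigvecs = np.linalg.eigh(A.T @ A)     # closed-form, no training
directions = eigvecs[:, ::-1]                  # sort by decreasing eigenvalue

# The directions form an orthonormal basis; editing a latent code z moves
# it along one of them.
z = rng.normal(size=64)
z_edited = z + 3.0 * directions[:, 0]
print(np.allclose(directions.T @ directions, np.eye(64)))  # True
```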
arXiv Detail & Related papers (2020-07-13T18:05:36Z)
- Polynomial-Time Exact MAP Inference on Discrete Models with Global Dependencies [83.05591911173332]
The junction tree algorithm is the most general solution for exact MAP inference with run-time guarantees.
We propose a new graph transformation technique via node cloning which ensures a run-time for solving our target problem that is independent of the form of the corresponding clique tree.
arXiv Detail & Related papers (2019-12-27T13:30:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.