Why Settle for Just One? Extending EL++ Ontology Embeddings with
Many-to-Many Relationships
- URL: http://arxiv.org/abs/2110.10555v1
- Date: Wed, 20 Oct 2021 13:23:18 GMT
- Title: Why Settle for Just One? Extending EL++ Ontology Embeddings with
Many-to-Many Relationships
- Authors: Biswesh Mohapatra, Sumit Bhatia, Raghava Mutharaju and G.
Srinivasaraghavan
- Abstract summary: Knowledge Graph embeddings provide a low-dimensional representation of entities and relations of a Knowledge Graph.
Recent efforts in this direction involve learning embeddings for a Description Logic (the logical underpinning for ontologies) named EL++.
We provide a simple and effective solution that allows such methods to consider many-to-many relationships while learning embedding representations.
- Score: 2.599882743586164
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge Graph (KG) embeddings provide a low-dimensional representation of
entities and relations of a Knowledge Graph and are used successfully for
various applications such as question answering and search, reasoning,
inference, and missing link prediction. However, most of the existing KG
embeddings only consider the network structure of the graph and ignore the
semantics and the characteristics of the underlying ontology that provides
crucial information about relationships between entities in the KG. Recent
efforts in this direction involve learning embeddings for a Description Logic
(logical underpinning for ontologies) named EL++. However, such methods
consider all the relations defined in the ontology to be one-to-one which
severely limits their performance and applications. We provide a simple and
effective solution to overcome this shortcoming that allows such methods to
consider many-to-many relationships while learning embedding representations.
Experiments conducted using three different EL++ ontologies show substantial
performance improvement over five baselines. Our proposed solution also paves
the way for learning embedding representations for even more expressive
description logics such as SROIQ.
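The one-to-one limitation the abstract refers to is easiest to see in translational embedding models. The following is a generic TransE/TransH-style sketch of the problem and of one standard relaxation; it is an illustration only, not the paper's actual solution, and all names in it are made up for this example.

```python
import numpy as np

# Toy sketch (not the paper's construction) of why purely translational
# embeddings force one-to-one relations, and how a relation-specific
# projection (TransH-style) relaxes this to many-to-many.
rng = np.random.default_rng(0)
dim = 8
h = rng.normal(size=dim)

def transe_score(h, r, t):
    """TransE-style score: lower means the triple fits better."""
    return float(np.linalg.norm(h + r - t))

# Under h + r ≈ t, two distinct tails t1, t2 cannot both score well:
r = rng.normal(size=dim)
t1 = h + r                      # fits exactly
t2 = t1 + np.ones(dim)          # a second, distinct tail
assert transe_score(h, r, t1) < 1e-9
assert transe_score(h, r, t2) > 1.0   # the second tail is penalised away

# TransH-style fix: project entities onto a relation-specific hyperplane
# (unit normal w) before translating, so many distinct tails can share
# one projection and the relation becomes many-to-many.
w = rng.normal(size=dim)
w /= np.linalg.norm(w)
r = r - np.dot(w, r) * w        # keep the translation inside the hyperplane

def project(x, w):
    return x - np.dot(w, x) * w

def transh_score(h, r, t, w):
    return float(np.linalg.norm(project(h, w) + r - project(t, w)))

# Two tails that differ only along w both fit the same (h, r) perfectly:
base = project(h, w) + r
t1, t2 = base + 0.5 * w, base - 1.5 * w
assert transh_score(h, r, t1, w) < 1e-9
assert transh_score(h, r, t2, w) < 1e-9
```

The same tension arises for the geometric (ball- and box-based) EL++ embeddings discussed above, where a single translation per role similarly collapses all tails of a head entity.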
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
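The weight-tying idea in the summary above can be sketched as follows; the vocabulary, tokenisation, and mean-pooling here are illustrative assumptions, not BytE's actual implementation.

```python
import numpy as np

# Hedged sketch of the subword idea behind BytE: embeddings are learned
# for byte-pair subword units, and an entity or relation vector is
# composed from them, so parameters are reused across every name that
# contains the same unit (including names unseen during training).
rng = np.random.default_rng(1)
dim = 4
subword_table = {sw: rng.normal(size=dim)
                 for sw in ["bio", "chem", "istry", "data", "base"]}

def embed(subwords):
    """Compose one embedding by mean-pooling subword vectors."""
    return np.mean([subword_table[sw] for sw in subwords], axis=0)

# "biochemistry" gets a vector even if that exact entity string never
# appeared in training, because its subword units did.
v = embed(["bio", "chem", "istry"])
assert v.shape == (dim,)
```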
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Konstruktor: A Strong Baseline for Simple Knowledge Graph Question Answering [60.6042489577575]
We introduce Konstruktor - an efficient and robust approach that breaks down the problem into three steps.
Our approach integrates language models and knowledge graphs, exploiting the power of the former and the interpretability of the latter.
We show that for relation detection, the most challenging step of the workflow, a combination of relation classification/generation and ranking outperforms other methods.
arXiv Detail & Related papers (2024-09-24T09:19:11Z)
- Reasoning Graph Enhanced Exemplars Retrieval for In-Context Learning [13.381974811214764]
We propose Reasoning Graph-enhanced Exemplar Retrieval (RGER).
RGER uses a graph kernel to select exemplars with both semantic and structural similarity.
The efficacy of RGER on math and logic reasoning tasks showcases its superiority over state-of-the-art retrieval-based approaches.
arXiv Detail & Related papers (2024-09-17T12:58:29Z)
- Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KG upon large language models (LLMs)
We augment the arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs.
Experiments across widely used datasets demonstrate that LACT has substantial improvements (an average gain of +5.5% MRR) over advanced methods.
arXiv Detail & Related papers (2024-05-02T18:12:08Z)
- Entity-Agnostic Representation Learning for Parameter-Efficient Knowledge Graph Embedding [30.7075844882004]
We propose an entity-agnostic representation learning method for handling the problem of inefficient parameter storage costs brought by embedding knowledge graphs.
We learn universal and entity-agnostic encoders for transforming distinguishable information into entity embeddings.
Experimental results show that EARL uses fewer parameters and performs better on link prediction tasks than baselines.
arXiv Detail & Related papers (2023-02-03T16:49:46Z)
- Dual Box Embeddings for the Description Logic EL++ [16.70961576041243]
Like Knowledge Graphs (KGs), ontologies are often incomplete, and maintaining and constructing them has proved challenging.
As with KGs, a promising approach is to learn embeddings in a latent vector space, while additionally ensuring they adhere to the semantics of the underlying DL.
We propose a novel ontology embedding method named Box2EL for the DL EL++, which represents both concepts and roles as boxes.
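The box representation mentioned above can be sketched in a few lines; this is a minimal illustration of axis-aligned boxes and containment, not Box2EL's actual scoring function, and the concepts used are invented for the example.

```python
import numpy as np

# Minimal sketch of the box-embedding idea: a concept is an axis-aligned
# box [lo, hi], and the subsumption C ⊑ D is modelled geometrically as
# C's box lying entirely inside D's box.
def box_inside(lo_c, hi_c, lo_d, hi_d):
    return bool(np.all(lo_c >= lo_d) and np.all(hi_c <= hi_d))

person = (np.array([0.0, 0.0]), np.array([10.0, 10.0]))
parent = (np.array([2.0, 2.0]), np.array([5.0, 5.0]))

# Parent ⊑ Person holds; the converse does not.
assert box_inside(*parent, *person)
assert not box_inside(*person, *parent)
```

Unlike balls, boxes are closed under intersection, which is one reason box-based methods fit conjunctions in EL++ more naturally.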
arXiv Detail & Related papers (2023-01-26T14:13:37Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Inductive Logical Query Answering in Knowledge Graphs [30.220508024471595]
We study the inductive query answering task where inference is performed on a graph containing new entities with queries over both seen and unseen entities.
We devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs).
Experimentally, we show that inductive models are able to perform logical reasoning at inference time over unseen nodes generalizing to graphs up to 500% larger than training ones.
arXiv Detail & Related papers (2022-10-13T03:53:34Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Joint Semantics and Data-Driven Path Representation for Knowledge Graph Inference [60.048447849653876]
We propose a novel joint semantics and data-driven path representation that balances explainability and generalization in the framework of KG embedding.
Our proposed model is evaluated on two classes of tasks: link prediction and path query answering task.
arXiv Detail & Related papers (2020-10-06T10:24:45Z)
- Dynamic Language Binding in Relational Visual Reasoning [67.85579756590478]
We present Language-binding Object Graph Network, the first neural reasoning method with dynamic relational structures across both visual and textual domains.
Our method outperforms other methods in sophisticated question-answering tasks wherein multiple object relations are involved.
arXiv Detail & Related papers (2020-04-30T06:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.