Why Settle for Just One? Extending EL++ Ontology Embeddings with
Many-to-Many Relationships
- URL: http://arxiv.org/abs/2110.10555v1
- Date: Wed, 20 Oct 2021 13:23:18 GMT
- Title: Why Settle for Just One? Extending EL++ Ontology Embeddings with
Many-to-Many Relationships
- Authors: Biswesh Mohapatra, Sumit Bhatia, Raghava Mutharaju and G.
Srinivasaraghavan
- Abstract summary: Knowledge Graph embeddings provide a low-dimensional representation of entities and relations of a Knowledge Graph.
Recent efforts in this direction involve learning embeddings for a Description Logic (the logical underpinning for ontologies) named EL++.
We provide a simple and effective solution that allows such methods to consider many-to-many relationships while learning embedding representations.
- Score: 2.599882743586164
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge Graph (KG) embeddings provide a low-dimensional representation of
entities and relations of a Knowledge Graph and are used successfully for
various applications such as question answering and search, reasoning,
inference, and missing link prediction. However, most of the existing KG
embeddings only consider the network structure of the graph and ignore the
semantics and the characteristics of the underlying ontology that provides
crucial information about relationships between entities in the KG. Recent
efforts in this direction involve learning embeddings for a Description Logic
(logical underpinning for ontologies) named EL++. However, such methods
consider all the relations defined in the ontology to be one-to-one, which
severely limits their performance and applications. We provide a simple and
effective solution to overcome this shortcoming that allows such methods to
consider many-to-many relationships while learning embedding representations.
Experiments conducted using three different EL++ ontologies show substantial
performance improvement over five baselines. Our proposed solution also paves
the way for learning embedding representations for even more expressive
description logics such as SROIQ.
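The abstract does not spell out the construction, but the one-to-one limitation it describes can be illustrated with a small sketch. Classic EL++ embedding methods (e.g. ELEm-style approaches) model concepts as n-balls and each relation as a single translation vector, which collapses many-to-many relations onto one offset. One simple remedy, shown below, projects concept centers onto a relation-specific hyperplane before translating (a TransH-style projection). This is a minimal illustration under stated assumptions; the projection choice and all names here are hypothetical, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Concepts as n-balls: a center vector and a radius (ELEm-style).
centers = {c: rng.normal(size=DIM) for c in ["C", "D"]}
radii = {c: 0.5 for c in ["C", "D"]}

# Relation r: a translation vector plus a unit normal w_r defining a
# relation-specific hyperplane (TransH-style; an assumption for illustration).
t_r = rng.normal(size=DIM)
w_r = rng.normal(size=DIM)
w_r /= np.linalg.norm(w_r)

def project(v, w):
    """Remove the component of v along the unit normal w,
    i.e. project v onto r's hyperplane."""
    return v - np.dot(v, w) * w

def subsumption_loss(c, r_vec, w, d):
    """Loss for the axiom C ⊑ ∃r.D: after projecting both centers onto
    r's hyperplane, the translated C-ball should lie inside the D-ball."""
    pc = project(centers[c], w) + r_vec
    pd = project(centers[d], w)
    return max(0.0, np.linalg.norm(pc - pd) + radii[c] - radii[d])

loss = subsumption_loss("C", t_r, w_r, "D")
print(loss >= 0.0)  # True: hinge loss is non-negative by construction
```

Because the projection differs per relation, two entities can coincide in one relation's subspace while staying apart in another's, which is what allows a single embedding space to represent many-to-many relationships.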
Related papers
- Improving Multi-hop Logical Reasoning in Knowledge Graphs with Context-Aware Query Representation Learning [3.7411114598484647]
Multi-hop logical reasoning on knowledge graphs is a pivotal task in natural language processing.
We propose a model-agnostic methodology that enhances the effectiveness of existing multi-hop logical reasoning approaches.
Our method consistently enhances the three multi-hop reasoning foundation models, achieving performance improvements of up to 19.5%.
arXiv Detail & Related papers (2024-06-11T07:48:20Z) - Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KGs built upon large language models (LLMs).
We augment the arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs.
Experiments across widely used datasets demonstrate that LACT achieves substantial improvements (an average +5.5% MRR gain) over advanced methods.
arXiv Detail & Related papers (2024-05-02T18:12:08Z) - Query Structure Modeling for Inductive Logical Reasoning Over Knowledge
Graphs [67.043747188954]
We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers.
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
arXiv Detail & Related papers (2023-05-23T01:25:29Z) - Complex Logical Reasoning over Knowledge Graphs using Large Language Models [13.594992599230277]
Reasoning over knowledge graphs (KGs) is a challenging task that requires a deep understanding of the relationships between entities.
Current approaches rely on learning geometries to embed entities in vector space for logical query operations.
We propose a novel decoupled approach, Language-guided Abstract Reasoning over Knowledge graphs (LARK), that formulates complex KG reasoning as a combination of contextual KG search and logical query reasoning.
arXiv Detail & Related papers (2023-05-02T02:21:49Z) - Entity-Agnostic Representation Learning for Parameter-Efficient
Knowledge Graph Embedding [30.7075844882004]
We propose an entity-agnostic representation learning method for handling the problem of inefficient parameter storage costs brought by embedding knowledge graphs.
We learn universal and entity-agnostic encoders for transforming distinguishable information into entity embeddings.
Experimental results show that EARL uses fewer parameters and performs better on link prediction tasks than baselines.
arXiv Detail & Related papers (2023-02-03T16:49:46Z) - Dual Box Embeddings for the Description Logic EL++ [16.70961576041243]
Like Knowledge Graphs (KGs), ontologies are often incomplete, and constructing and maintaining them has proved challenging.
Similar to KGs, a promising approach is to learn embeddings in a latent vector space, while additionally ensuring they adhere to the semantics of the underlying DL.
We propose a novel ontology embedding method named Box2EL for the DL EL++, which represents both concepts and roles as boxes.
arXiv Detail & Related papers (2023-01-26T14:13:37Z) - UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question
Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graphs (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z) - Inductive Logical Query Answering in Knowledge Graphs [30.220508024471595]
We study the inductive query answering task where inference is performed on a graph containing new entities with queries over both seen and unseen entities.
We devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs).
Experimentally, we show that inductive models are able to perform logical reasoning at inference time over unseen nodes generalizing to graphs up to 500% larger than training ones.
arXiv Detail & Related papers (2022-10-13T03:53:34Z) - Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z) - Joint Semantics and Data-Driven Path Representation for Knowledge Graph
Inference [60.048447849653876]
We propose a novel joint semantics and data-driven path representation that balances explainability and generalization in the framework of KG embedding.
Our proposed model is evaluated on two classes of tasks: link prediction and path query answering task.
arXiv Detail & Related papers (2020-10-06T10:24:45Z) - Dynamic Language Binding in Relational Visual Reasoning [67.85579756590478]
We present Language-binding Object Graph Network, the first neural reasoning method with dynamic relational structures across both visual and textual domains.
Our method outperforms other methods in sophisticated question-answering tasks wherein multiple object relations are involved.
arXiv Detail & Related papers (2020-04-30T06:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.