Learning Attention-based Representations from Multiple Patterns for
Relation Prediction in Knowledge Graphs
- URL: http://arxiv.org/abs/2206.04801v1
- Date: Tue, 7 Jun 2022 10:53:35 GMT
- Authors: Vítor Lourenço and Aline Paes
- Abstract summary: ÆMP is a novel model for learning contextualized representations by acquiring entities' context information.
ÆMP either outperforms or competes with state-of-the-art relation prediction methods.
- Score: 2.4028383570062606
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge bases, and their representations in the form of knowledge graphs
(KGs), are naturally incomplete. Since scientific and industrial applications
have extensively adopted them, there is a high demand for solutions that
complete their information. Several recent works tackle this challenge by
learning embeddings for entities and relations, then employing them to predict
new relations among the entities. Despite their advances, most of those
methods focus only on the local neighbors of a relation to learn the
embeddings. As a result, they may fail to capture the KGs' context information
by neglecting long-term dependencies and the propagation of entities'
semantics. In this manuscript, we propose ÆMP (Attention-based Embeddings
from Multiple Patterns), a novel model for learning contextualized
representations by: (i) acquiring entities' context information through an
attention-enhanced message-passing scheme, which captures the entities' local
semantics while focusing on different aspects of their neighborhood; and (ii)
capturing the semantic context by leveraging the paths between entities and
the relationships along them. Our empirical findings draw insights into how attention
mechanisms can improve entities' context representation and how combining
entities and semantic path contexts improves the general representation of
entities and the relation predictions. Experimental results on several large
and small knowledge graph benchmarks show that ÆMP either outperforms or
competes with state-of-the-art relation prediction methods.
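The attention-enhanced message-passing scheme in (i) can be sketched in plain Python: each entity's embedding is updated with an attention-weighted aggregate of its neighbors' embeddings, blending local semantics with neighborhood context. Everything here (dot-product attention scoring, the 0.5 blending weight, the function names) is an illustrative assumption, not the authors' implementation.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def attention_message_pass(embeddings, neighbors):
    """One round of attention-weighted neighbor aggregation.

    embeddings: dict mapping entity -> embedding (list of floats)
    neighbors:  dict mapping entity -> list of neighbor entities
    """
    updated = {}
    for entity, emb in embeddings.items():
        nbrs = neighbors.get(entity, [])
        if not nbrs:
            # No neighborhood context: keep the local embedding.
            updated[entity] = list(emb)
            continue
        # Attention score: similarity between the entity and each neighbor.
        weights = softmax([dot(emb, embeddings[n]) for n in nbrs])
        # Message: attention-weighted sum of neighbor embeddings.
        msg = [sum(w * embeddings[n][i] for w, n in zip(weights, nbrs))
               for i in range(len(emb))]
        # Blend local semantics with the aggregated neighborhood context.
        updated[entity] = [0.5 * a + 0.5 * b for a, b in zip(emb, msg)]
    return updated

# Toy knowledge graph with 2-dimensional entity embeddings.
embs = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
nbrs = {"a": ["b", "c"], "b": ["a"], "c": []}
out = attention_message_pass(embs, nbrs)
```

Stacking several such rounds, each attending to different aspects of the neighborhood, is one way to propagate entity semantics beyond immediate neighbors.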
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Knowledge Graphs and Pre-trained Language Models enhanced Representation Learning for Conversational Recommender Systems [58.561904356651276]
We introduce the Knowledge-Enhanced Entity Representation Learning (KERL) framework, which uses a knowledge graph and a pre-trained language model to improve the semantic understanding of entities in conversational recommender systems.
KERL achieves state-of-the-art results in both recommendation and response generation tasks.
arXiv Detail & Related papers (2023-12-18T06:41:23Z)
- Knowledge-Enhanced Hierarchical Information Correlation Learning for Multi-Modal Rumor Detection [82.94413676131545]
We propose a novel knowledge-enhanced hierarchical information correlation learning approach (KhiCL) for multi-modal rumor detection.
KhiCL exploits a cross-modal joint dictionary to transfer heterogeneous unimodal features into a common feature space.
It extracts visual and textual entities from images and text, and designs a knowledge relevance reasoning strategy.
arXiv Detail & Related papers (2023-06-28T06:08:20Z)
- Explainable Representations for Relation Prediction in Knowledge Graphs [0.0]
We propose SEEK, a novel approach for explainable representations to support relation prediction in knowledge graphs.
It is based on identifying relevant shared semantic aspects between entities and learning representations for each subgraph.
We evaluate SEEK on two real-world relation prediction tasks: protein-protein interaction prediction and gene-disease association prediction.
arXiv Detail & Related papers (2023-06-22T06:18:40Z)
- Message Intercommunication for Inductive Relation Reasoning [49.731293143079455]
We develop a novel inductive relation reasoning model called MINES.
We introduce a Message Intercommunication mechanism on the Neighbor-Enhanced Subgraph.
Our experiments show that MINES outperforms existing state-of-the-art models.
arXiv Detail & Related papers (2023-05-23T13:51:46Z)
- Topics as Entity Clusters: Entity-based Topics from Large Language Models and Graph Neural Networks [0.6486052012623045]
We propose a novel topic clustering approach using bimodal vector representations of entities.
Our approach is better suited to working with entities than state-of-the-art models.
arXiv Detail & Related papers (2023-01-06T10:54:54Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- Unified Graph Structured Models for Video Understanding [93.72081456202672]
We propose a message passing graph neural network that explicitly models relational-temporal relations.
We show how our method is able to more effectively model relationships between relevant entities in the scene.
arXiv Detail & Related papers (2021-03-29T14:37:35Z)
- DisenE: Disentangling Knowledge Graph Embeddings [33.169388832519]
DisenE is an end-to-end framework to learn disentangled knowledge graph embeddings.
We introduce an attention-based mechanism that enables the model to explicitly focus on relevant components of entity embeddings according to a given relation.
arXiv Detail & Related papers (2020-10-28T03:45:19Z)
- Explainable Link Prediction for Emerging Entities in Knowledge Graphs [44.87285668747474]
Cross-domain knowledge graphs suffer from inherent incompleteness and sparsity.
Link prediction can alleviate this by inferring a target entity, given a source entity and a query relation.
We propose an inductive representation learning framework that is able to learn representations of previously unseen entities.
arXiv Detail & Related papers (2020-05-01T22:17:37Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.