Universal Preprocessing Operators for Embedding Knowledge Graphs with
Literals
- URL: http://arxiv.org/abs/2309.03023v1
- Date: Wed, 6 Sep 2023 14:08:46 GMT
- Title: Universal Preprocessing Operators for Embedding Knowledge Graphs with
Literals
- Authors: Patryk Preisner, Heiko Paulheim
- Abstract summary: Knowledge graph embeddings are dense numerical representations of entities in a knowledge graph (KG).
We propose a set of universal preprocessing operators which can be used to transform KGs with literals for numerical, temporal, textual, and image information.
Experiments on the kgbench dataset with three different embedding methods yield promising results.
- Score: 2.211868306499727
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graph embeddings are dense numerical representations of entities in
a knowledge graph (KG). While the majority of approaches concentrate only on
relational information, i.e., relations between entities, fewer approaches
exist which also take information about literal values (e.g., textual
descriptions or numerical information) into account. Those which exist are
typically tailored towards a particular modality of literal and a particular
embedding method. In this paper, we propose a set of universal preprocessing
operators which can be used to transform KGs with literals for numerical,
temporal, textual, and image information, so that the transformed KGs can be
embedded with any method. Experiments on the kgbench dataset with three
different embedding methods yield promising results.
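One way to picture such a preprocessing operator is binning: a triple with a numeric literal is rewritten into a purely relational triple whose object is a discrete "bin entity", so that any relational embedding method can consume the transformed graph. The sketch below is a hypothetical illustration of this idea; the bin widths, naming scheme, and example values are assumptions, not the paper's exact operators.

```python
# Hypothetical sketch of one preprocessing operator: replace a numeric
# literal with a "bin entity" so the transformed KG is purely relational.
# Bin boundaries and the naming scheme are invented for illustration.

def bin_numeric_literal(subject, predicate, value, bin_width=10):
    """Turn (subject, predicate, numeric value) into a relational triple
    by mapping the value to a discrete bin entity."""
    lower = (value // bin_width) * bin_width
    bin_entity = f"{predicate}_bin_{lower}_{lower + bin_width}"
    return (subject, predicate, bin_entity)

# Toy triples carrying numeric literals (values are made up).
triples_with_literals = [
    ("Berlin", "population_k", 3645),
    ("Mannheim", "population_k", 309),
]

relational_triples = [
    bin_numeric_literal(s, p, v, bin_width=1000)
    for s, p, v in triples_with_literals
]
print(relational_triples)
# [('Berlin', 'population_k', 'population_k_bin_3000_4000'),
#  ('Mannheim', 'population_k', 'population_k_bin_0_1000')]
```

Entities falling into the same bin then share an object node, which is exactly the kind of structure a standard relational embedding method can exploit.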
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
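The weight-tying idea can be illustrated with a tiny sketch: embeddings live in a shared table keyed by subword units, and an entity's embedding is composed from its subwords, so entities sharing subwords reuse the same parameters. The toy "BPE" split and vectors below are invented for illustration and are not BytE's actual architecture.

```python
# Rough illustration of the subword weight-tying idea: parameters are
# stored per subword unit, not per entity, so entities that share
# subwords share parameters. Splits and vectors are made up.

SUBWORD_EMB = {            # shared table: one vector per subword unit
    "know": [1.0, 0.0],
    "ledge": [0.0, 1.0],
    "graph": [0.5, 0.5],
}

def embed(subwords):
    """Compose an entity embedding by summing its subword embeddings."""
    dim = len(next(iter(SUBWORD_EMB.values())))
    vec = [0.0] * dim
    for unit in subwords:
        for i, x in enumerate(SUBWORD_EMB[unit]):
            vec[i] += x
    return vec

# "knowledge" and "knowledge graph" reuse the "know"/"ledge" parameters.
print(embed(["know", "ledge"]))           # [1.0, 1.0]
print(embed(["know", "ledge", "graph"]))  # [1.5, 1.5]
```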
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Text-To-KG Alignment: Comparing Current Methods on Classification Tasks [2.191505742658975]
Knowledge graphs (KGs) provide dense and structured representations of factual information.
Recent work has focused on creating pipeline models that retrieve information from KGs as additional context.
It is not known how current methods compare to a scenario where the aligned subgraph is completely relevant to the query.
arXiv Detail & Related papers (2023-06-05T13:45:45Z)
- Learning Representations without Compositional Assumptions [79.12273403390311]
We propose a data-driven approach that learns feature set dependencies by representing feature sets as graph nodes and their relationships as learnable edges.
We also introduce LEGATO, a novel hierarchical graph autoencoder that learns a smaller, latent graph to aggregate information from multiple views dynamically.
arXiv Detail & Related papers (2023-05-31T10:36:10Z)
- Conversational Semantic Parsing using Dynamic Context Graphs [68.72121830563906]
We consider the task of conversational semantic parsing over general purpose knowledge graphs (KGs) with millions of entities, and thousands of relation-types.
We focus on models which are capable of interactively mapping user utterances into executable logical forms.
arXiv Detail & Related papers (2023-05-04T16:04:41Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- StarGraph: A Coarse-to-Fine Representation Method for Large-Scale Knowledge Graph [0.6445605125467573]
We propose a method named StarGraph, which gives a novel way to utilize the neighborhood information for large-scale knowledge graphs.
The proposed method achieves the best results on the ogbl-wikikg2 dataset, which validates its effectiveness.
arXiv Detail & Related papers (2022-05-27T19:32:45Z)
- Knowledge Graph Completion with Text-aided Regularization [2.8361571014635407]
Knowledge graph completion is the task of expanding a knowledge graph/base by estimating possible missing entities.
Traditional approaches mainly focus on using the structural information intrinsic to the graph.
We try numerous ways of using extracted or raw textual information to help existing KG embedding frameworks reach better prediction results.
arXiv Detail & Related papers (2021-01-22T06:10:09Z)
- ENT-DESC: Entity Description Generation by Exploring Knowledge Graph [53.03778194567752]
In practice, the input knowledge could be more than enough, since the output description may only cover the most significant knowledge.
We introduce a large-scale and challenging dataset to facilitate the study of such a practical scenario in KG-to-text.
We propose a multi-graph structure that is able to represent the original graph information more comprehensively.
arXiv Detail & Related papers (2020-04-30T14:16:19Z)
- Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, which motivates their automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings.
Textual encoding approaches, e.g., KG-BERT, resort to graph triples' text and triple-level contextualized representations.
arXiv Detail & Related papers (2020-04-30T13:50:34Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- Entity Type Prediction in Knowledge Graphs using Embeddings [2.7528170226206443]
Open Knowledge Graphs (such as DBpedia, Wikidata, YAGO) have been recognized as the backbone of diverse applications in the field of data mining and information retrieval.
Most of these KGs are created either via automated information extraction from snapshots, from information accumulated by users, or from Wikipedia.
It has been observed that the type information of these KGs is often noisy, incomplete, and incorrect.
A multi-label classification approach is proposed in this work for entity typing using KG embeddings.
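A minimal way to picture multi-label entity typing over precomputed KG embeddings is a prototype-based classifier: each type gets a reference vector, and an entity receives every type whose similarity passes a threshold. The vectors, types, and threshold below are invented for illustration and are not the paper's actual classifier.

```python
# Toy sketch of multi-label entity typing on top of precomputed KG
# embeddings. Prototype vectors and the threshold are made up.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

TYPE_PROTOTYPES = {
    "Person":  [1.0, 0.0],
    "Athlete": [0.9, 0.1],
    "City":    [0.0, 1.0],
}

def predict_types(entity_vec, threshold=0.95):
    """Return every type whose prototype is similar enough (multi-label)."""
    return sorted(
        t for t, proto in TYPE_PROTOTYPES.items()
        if cosine(entity_vec, proto) >= threshold
    )

print(predict_types([0.95, 0.05]))  # ['Athlete', 'Person']
```

Because the threshold admits multiple types at once, a single entity can be labeled both "Person" and "Athlete", which is the multi-label aspect the summary describes.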
arXiv Detail & Related papers (2020-04-28T17:57:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.