A Review of Knowledge Graph Completion
- URL: http://arxiv.org/abs/2208.11652v1
- Date: Wed, 24 Aug 2022 16:42:59 GMT
- Title: A Review of Knowledge Graph Completion
- Authors: Mohamad Zamini, Hassan Reza, Minou Rabiei
- Abstract summary: Information extraction methods have proved effective at extracting triples from structured or unstructured data.
Most of the current knowledge graphs are incomplete.
In order to use KGs in downstream tasks, it is desirable to predict missing links in KGs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Information extraction methods have proved effective at extracting triples
from structured or unstructured data. The organization of such triples in the
form of (head entity, relation, tail entity) is called the construction of
Knowledge Graphs (KGs). Most of the current knowledge graphs are incomplete. In
order to use KGs in downstream tasks, it is desirable to predict missing links
in KGs. Various approaches have recently been proposed for representation
learning of KGs by embedding both entities and relations into a low-dimensional
vector space, aiming to predict unknown triples based on previously observed
triples. According to whether triples are treated independently or dependently,
we divide the task of knowledge graph completion into conventional and graph
neural network (GNN)-based representation learning and discuss them in more
detail. In conventional approaches, each triple is processed independently,
whereas in GNN-based approaches, triples also take their local neighborhood
into account.
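To make the conventional, triple-independent setting concrete, the sketch below scores a candidate triple with a TransE-style translation in a low-dimensional vector space (TransE is one of the embedding models referenced later in this list). The embedding size, the toy vocabulary, and the random (untrained) vectors are assumptions for illustration only.

    # Minimal TransE-style scoring sketch: entities and relations share one
    # low-dimensional space, and (head, relation, tail) is plausible when
    # head + relation lands close to tail.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 8                                     # assumed embedding size
    entities = {"Paris": 0, "France": 1, "Berlin": 2}
    relations = {"capital_of": 0}

    E = rng.normal(size=(len(entities), dim))   # entity embeddings (untrained here)
    R = rng.normal(size=(len(relations), dim))  # relation embeddings

    def score(head, rel, tail):
        # Negative L2 distance; higher means the triple is ranked as more plausible.
        h, r, t = E[entities[head]], R[relations[rel]], E[entities[tail]]
        return -np.linalg.norm(h + r - t)

    # Link prediction as ranking: order candidate tails for (Paris, capital_of, ?).
    print(sorted(entities, key=lambda e: score("Paris", "capital_of", e), reverse=True))

In a trained model the embeddings would be learned with a margin-based loss over observed versus corrupted triples; with random vectors the ranking above is meaningless and only the scoring mechanics are shown.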
Related papers
- Few-shot Knowledge Graph Relational Reasoning via Subgraph Adaptation [51.47994645529258]
Few-shot Knowledge Graph (KG) Reasoning aims to predict unseen triplets (i.e., query triplets) for rare relations in KGs.
We propose SAFER (Subgraph Adaptation for Few-shot Reasoning), a novel approach that effectively adapts the information in contextualized graphs to various subgraphs.
arXiv Detail & Related papers (2024-06-19T21:40:35Z)
- Knowledge Graph Completion using Structural and Textual Embeddings [0.0]
We propose a relation prediction model that harnesses both textual and structural information within Knowledge Graphs.
Our approach integrates walk-based embeddings with language model embeddings to represent nodes effectively.
We demonstrate that our model achieves competitive results in the relation prediction task when evaluated on a widely used dataset.
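The abstract does not spell out how the two embedding types are fused; a simple, assumed scheme is to concatenate a walk-based node vector with a language-model vector of the entity's description, as sketched below. The helper functions and dimensions are hypothetical stand-ins, not the paper's components.

    # Hypothetical fusion of structural (walk-based) and textual (language-model)
    # node representations by concatenation; not the paper's exact architecture.
    import numpy as np

    def walk_embedding(node_id, dim=16):
        # Stand-in for a DeepWalk/node2vec-style vector learned from random walks.
        return np.random.default_rng(node_id).normal(size=dim)

    def text_embedding(description, dim=32):
        # Stand-in for a sentence embedding of the entity's textual description.
        return np.random.default_rng(abs(hash(description)) % 2**32).normal(size=dim)

    def node_representation(node_id, description):
        # Concatenate both views; a relation-prediction head would consume this.
        return np.concatenate([walk_embedding(node_id), text_embedding(description)])

    print(node_representation(3, "Paris is the capital of France.").shape)  # (48,)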
arXiv Detail & Related papers (2024-04-24T21:04:14Z)
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations that have only a few associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- Walk-and-Relate: A Random-Walk-based Algorithm for Representation Learning on Sparse Knowledge Graphs [5.444459446244819]
We propose an efficient method to augment the number of triplets to address the problem of data sparsity.
We also provide approaches to accurately and efficiently filter the set of possible metapaths down to the informative ones.
The proposed approaches are model-agnostic, and the augmented training dataset can be used with any KG embedding approach out of the box.
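As a rough illustration of walk-based triple augmentation: a two-hop walk h -> r1 -> m -> r2 -> t induces the metapath (r1, r2), and walks whose metapath passes an informativeness filter can be turned into new training triples between h and t. The toy graph, the metapath whitelist, and the synthetic relation label below are assumptions, not the paper's algorithm.

    # Illustrative triple augmentation from 2-hop random walks: keep only
    # walks whose metapath (relation sequence) is judged informative.
    import random

    triples = [("a", "works_at", "b"), ("b", "located_in", "c"),
               ("a", "friend_of", "d"), ("d", "works_at", "b")]
    out_edges = {}
    for h, r, t in triples:
        out_edges.setdefault(h, []).append((r, t))

    informative_metapaths = {("works_at", "located_in")}   # assumed filter output

    def augment(num_walks=100, seed=0):
        rng = random.Random(seed)
        new_triples = set()
        for _ in range(num_walks):
            h = rng.choice(list(out_edges))
            r1, m = rng.choice(out_edges[h])
            if m not in out_edges:
                continue
            r2, t = rng.choice(out_edges[m])
            if (r1, r2) in informative_metapaths:
                new_triples.add((h, f"{r1}->{r2}", t))     # synthetic relation label
        return new_triples

    print(augment())   # e.g. {('a', 'works_at->located_in', 'c'), ...}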
arXiv Detail & Related papers (2022-09-19T05:35:23Z)
- Repurposing Knowledge Graph Embeddings for Triple Representation via Weak Supervision [77.34726150561087]
Current methods learn triple embeddings from scratch without utilizing entity and predicate embeddings from pre-trained models.
We develop a method for automatically sampling triples from a knowledge graph and estimating their pairwise similarities from pre-trained embedding models.
These pairwise similarity scores are then fed to a Siamese-like neural architecture to fine-tune triple representations.
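A compact sketch of that weak-supervision loop (written with PyTorch for brevity): similarity targets for sampled triple pairs come from frozen pre-trained embeddings, and a small shared encoder is tuned so its outputs reproduce those similarities. The encoder shape, cosine similarity, and MSE loss are assumptions; in the paper the Siamese network would consume the triples themselves rather than the frozen vectors used here to keep the sketch self-contained.

    # Weakly supervised triple representations: frozen pre-trained vectors provide
    # pairwise similarity targets; a shared encoder is fine-tuned to match them.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    num_triples, in_dim, out_dim = 64, 30, 16
    pretrained = torch.randn(num_triples, in_dim)   # stand-in for pre-trained triple features

    cos = nn.functional.cosine_similarity
    encoder = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, out_dim))
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

    for step in range(200):
        i = torch.randint(0, num_triples, (32,))
        j = torch.randint(0, num_triples, (32,))
        target = cos(pretrained[i], pretrained[j], dim=-1)               # weak labels
        pred = cos(encoder(pretrained[i]), encoder(pretrained[j]), dim=-1)
        loss = nn.functional.mse_loss(pred, target)
        opt.zero_grad()
        loss.backward()
        opt.step()

    print(float(loss))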
arXiv Detail & Related papers (2022-08-22T14:07:08Z)
- Triple Classification for Scholarly Knowledge Graph Completion [1.9322973059079729]
We present exBERT, a method for leveraging pre-trained transformer language models to perform scholarly knowledge graph completion.
The evaluation shows that exBERT outperforms other baselines in the tasks of triple classification, link prediction, and relation prediction.
We present two scholarly datasets as resources for the research community, collected from public KGs and online resources.
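The general pattern behind transformer-based triple classification can be sketched as verbalizing a triple and scoring it with a pre-trained sequence classifier. The checkpoint name, the verbalization format, and the Hugging Face transformers dependency below are assumed for illustration; they are not exBERT's exact configuration.

    # Generic triple classification with a pre-trained transformer (assumed
    # setup): the triple is rendered as text and scored as valid/invalid.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    model_name = "bert-base-uncased"    # placeholder checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

    def classify_triple(head, relation, tail):
        # One common verbalization: head as the first segment, relation + tail as the second.
        inputs = tokenizer(head, f"{relation} {tail}", return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        return torch.softmax(logits, dim=-1)[0, 1].item()   # P(triple is valid)

    # Before fine-tuning on labeled triples this probability is meaningless;
    # the call only illustrates the input/output plumbing.
    print(classify_triple("ResNet", "evaluated on", "ImageNet"))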
arXiv Detail & Related papers (2021-11-23T13:16:31Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graphs (KGs) play an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- FedE: Embedding Knowledge Graphs in Federated Setting [21.022513922373207]
A multi-source KG is a common situation in real Knowledge Graph applications.
Because of data privacy and sensitivity, a set of relevant knowledge graphs cannot complement each other's KGC simply by pooling the data from the different knowledge graphs.
We propose a Federated Knowledge Graph Embedding framework FedE, focusing on learning knowledge graph embeddings by aggregating locally-computed updates.
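The aggregation step can be pictured as federated averaging over per-client entity embedding tables: each client trains locally on its own triples, and the server averages the embeddings of the entities each client actually holds. The toy clients, the uniform weighting, and the shared entity index below are simplifying assumptions, not FedE's exact protocol.

    # Minimal federated-averaging sketch for entity embeddings (illustrative):
    # the server averages each entity's embedding over the clients that own it.
    import numpy as np

    num_entities, dim = 5, 4

    def local_update(seed, owned_entities):
        # Stand-in for one round of local KG-embedding training on a client.
        emb = np.zeros((num_entities, dim))
        emb[owned_entities] = np.random.default_rng(seed).normal(size=(len(owned_entities), dim))
        return emb, owned_entities

    clients = [local_update(1, [0, 1, 2]), local_update(2, [2, 3, 4])]

    global_emb = np.zeros((num_entities, dim))
    counts = np.zeros(num_entities)
    for emb, owned in clients:
        global_emb[owned] += emb[owned]
        counts[owned] += 1

    seen = counts > 0
    global_emb[seen] = global_emb[seen] / counts[seen][:, None]   # average where seen
    print(counts)   # entity 2 is held by both clients and gets averaged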
arXiv Detail & Related papers (2020-10-24T11:52:05Z)
- Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, which calls for automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings.
Textual encoding approaches, e.g., KG-BERT, resort to the text of graph triples and triple-level contextualized representations.
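This entry names the two families (structure-based scoring such as TransE and text-based scoring such as KG-BERT) without detailing how the paper combines them; one simple, assumed way to fuse the two signals is a weighted sum of their normalized scores, as below.

    # Assumed fusion of a structural score (a translation distance, squashed to
    # (0, 1)) and a textual score (a classifier probability); not this paper's method.
    import math

    def combined_score(structure_score, text_score, alpha=0.5):
        structural_prob = 1.0 / (1.0 + math.exp(-structure_score))   # sigmoid
        return alpha * structural_prob + (1.0 - alpha) * text_score

    print(combined_score(structure_score=-0.3, text_score=0.8))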
arXiv Detail & Related papers (2020-04-30T13:50:34Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to free this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
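The described mechanism, connecting text descriptions to the KG embedding space with a GAN, can be sketched as a generator that maps an encoded relation description to a candidate relation embedding and a discriminator that tries to distinguish it from embeddings of seen relations. The layer sizes and the single PyTorch training step below are assumptions, not the paper's architecture.

    # Zero-shot relation embedding generation with a GAN (illustrative shapes only).
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    text_dim, rel_dim = 24, 16
    generator = nn.Sequential(nn.Linear(text_dim, 32), nn.ReLU(), nn.Linear(32, rel_dim))
    discriminator = nn.Sequential(nn.Linear(rel_dim, 32), nn.ReLU(), nn.Linear(32, 1))

    text_feat = torch.randn(1, text_dim)    # stand-in for an encoded text description
    fake_rel = generator(text_feat)         # generated relation embedding
    real_rel = torch.randn(1, rel_dim)      # stand-in embedding of a seen relation

    # Standard GAN objectives: the discriminator separates real from generated
    # relation embeddings; the generator tries to fool it. At test time the
    # generator alone produces embeddings for unseen (zero-shot) relations.
    bce = nn.BCEWithLogitsLoss()
    d_loss = bce(discriminator(real_rel), torch.ones(1, 1)) + \
             bce(discriminator(fake_rel.detach()), torch.zeros(1, 1))
    g_loss = bce(discriminator(fake_rel), torch.ones(1, 1))
    print(float(d_loss), float(g_loss))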
arXiv Detail & Related papers (2020-01-08T01:19:08Z)