InGram: Inductive Knowledge Graph Embedding via Relation Graphs
- URL: http://arxiv.org/abs/2305.19987v3
- Date: Thu, 17 Aug 2023 14:08:26 GMT
- Title: InGram: Inductive Knowledge Graph Embedding via Relation Graphs
- Authors: Jaejun Lee, Chanyoung Chung, Joyce Jiyoung Whang
- Abstract summary: In this paper, we propose an INductive knowledge GRAph eMbedding method, InGram, that can generate embeddings of new relations as well as new entities at inference time.
Experimental results show that InGram outperforms 14 different state-of-the-art methods on varied inductive learning scenarios.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inductive knowledge graph completion has been considered as the task of
predicting missing triplets between new entities that are not observed during
training. While most inductive knowledge graph completion methods assume that
all entities can be new, they do not allow new relations to appear at inference
time. This restriction prohibits the existing methods from appropriately
handling real-world knowledge graphs where new entities accompany new
relations. In this paper, we propose an INductive knowledge GRAph eMbedding
method, InGram, that can generate embeddings of new relations as well as new
entities at inference time. Given a knowledge graph, we define a relation graph
as a weighted graph consisting of relations and the affinity weights between
them. Based on the relation graph and the original knowledge graph, InGram
learns how to aggregate neighboring embeddings to generate relation and entity
embeddings using an attention mechanism. Experimental results show that InGram
outperforms 14 different state-of-the-art methods on varied inductive learning
scenarios.
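The abstract's core idea can be illustrated with a minimal NumPy sketch: build a weighted relation graph whose edge weights reflect how many entities two relations share, then aggregate neighboring relation embeddings through those weights. Note this is a simplified stand-in for illustration only; InGram's actual affinity definition and attention mechanism are more involved than the uniform mixing shown here.

```python
import numpy as np
from collections import defaultdict

def build_relation_graph(triplets, num_relations):
    """Build a weighted relation graph: two relations get an affinity
    weight proportional to the entities they co-occur with.
    (A simplified stand-in for InGram's affinity definition.)"""
    ents_of_rel = defaultdict(set)
    for h, r, t in triplets:
        ents_of_rel[r].update((h, t))
    A = np.zeros((num_relations, num_relations))
    for i in range(num_relations):
        for j in range(num_relations):
            if i != j:
                A[i, j] = len(ents_of_rel[i] & ents_of_rel[j])
    # Row-normalize so the weights act like attention coefficients.
    row_sums = A.sum(axis=1, keepdims=True)
    return np.divide(A, row_sums, out=np.zeros_like(A), where=row_sums > 0)

def aggregate_relation_embeddings(A, rel_emb):
    """One aggregation step: each relation's new embedding is the
    affinity-weighted mix of its neighbors' embeddings."""
    return A @ rel_emb

# Toy knowledge graph: triplets are (head entity, relation, tail entity).
triplets = [(0, 0, 1), (1, 1, 2), (0, 1, 2), (3, 2, 4)]
A = build_relation_graph(triplets, num_relations=3)
rel_emb = np.random.default_rng(0).normal(size=(3, 4))
new_emb = aggregate_relation_embeddings(A, rel_emb)
```

Because a new relation's embedding is computed from its neighbors in the relation graph rather than looked up in a fixed table, unseen relations can receive embeddings at inference time.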
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
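The weight-tying idea behind this summary can be sketched as follows: rather than one vector per entity or relation, embed shared subword units and compose them. The vocabulary and function names below are illustrative assumptions, not the paper's actual API or tokenization.

```python
import numpy as np

# Hypothetical subword-unit table shared across all entities/relations.
# Weight tying: "Paris" and "London" reuse entries from this one table.
rng = np.random.default_rng(42)
subword_vocab = {"par": 0, "is": 1, "lon": 2, "don": 3, "capital": 4, "of": 5}
subword_emb = rng.normal(size=(len(subword_vocab), 8))

def embed(token_subwords):
    """Compose an entity/relation embedding by summing the embeddings
    of its byte-pair-style subword units."""
    idx = [subword_vocab[s] for s in token_subwords]
    return subword_emb[idx].sum(axis=0)

paris = embed(["par", "is"])
capital_of = embed(["capital", "of"])
```

Since unseen names decompose into known subword units, this composition naturally extends to entities and relations never observed during training.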
arXiv Detail & Related papers (2024-10-09T10:20:54Z) - Extending Transductive Knowledge Graph Embedding Models for Inductive
Logical Relational Inference [0.5439020425819]
This work bridges the gap between traditional transductive knowledge graph embedding approaches and more recent inductive relation prediction models.
We introduce a generalized form of harmonic extension which leverages representations learned through transductive embedding methods to infer representations of new entities introduced at inference time as in the inductive setting.
In experiments on a number of large-scale knowledge graph embedding benchmarks, we find that this approach for extending the functionality of transductive knowledge graph embedding models is competitive with--and in some scenarios outperforms--several state-of-the-art models derived explicitly for such inductive tasks.
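A rough illustration of the harmonic-extension idea, using a TransE-style score (h + r ≈ t) as the transductive backbone: a new entity's embedding is inferred by averaging the positions implied by its edges to known entities. This is an assumption-laden sketch, not the paper's exact operator.

```python
import numpy as np

def harmonic_extension_transe(new_entity_edges, ent_emb, rel_emb):
    """Infer an unseen entity's embedding from its edges to known
    entities under a TransE-style model (h + r = t). Each edge implies
    a position for the new entity; we average them."""
    implied = []
    for direction, rel, known_ent in new_entity_edges:
        if direction == "head":   # (new, rel, known): new = known - rel
            implied.append(ent_emb[known_ent] - rel_emb[rel])
        else:                     # (known, rel, new): new = known + rel
            implied.append(ent_emb[known_ent] + rel_emb[rel])
    return np.mean(implied, axis=0)

# Toy pretrained embeddings for two known entities and one relation.
ent_emb = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
rel_emb = np.array([[0.5, 0.5, 0.0]])
edges = [("head", 0, 0), ("tail", 0, 1)]
new_emb = harmonic_extension_transe(edges, ent_emb, rel_emb)
```

The averaging step can be read as minimizing the squared TransE error over the new entity's incident edges, which is the discrete analogue of solving a harmonic (Laplace-type) problem on the graph.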
arXiv Detail & Related papers (2023-09-07T15:24:18Z) - Graph Relation Aware Continual Learning [3.908470250825618]
Continual graph learning (CGL) studies the problem of learning from an infinite stream of graph data.
We design a relation-aware adaptive model, dubbed as RAM-CG, that consists of a relation-discovery modular to explore latent relations behind edges.
RAM-CG provides 2.2%, 6.9%, and 6.6% accuracy improvements over state-of-the-art results on the CitationNet, OGBN-arxiv, and TWITCH datasets.
arXiv Detail & Related papers (2023-08-16T09:53:20Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Unbiased Graph Embedding with Biased Graph Observations [52.82841737832561]
We propose a principled new way for obtaining unbiased representations by learning from an underlying bias-free graph.
Based on this new perspective, we propose two complementary methods for uncovering such an underlying graph.
arXiv Detail & Related papers (2021-10-26T18:44:37Z) - Knowledge Sheaves: A Sheaf-Theoretic Framework for Knowledge Graph
Embedding [1.5469452301122175]
We show that knowledge graph embedding is naturally expressed in the topological and categorical language of cellular sheaves.
A knowledge graph embedding can be described as an approximate global section of an appropriate knowledge sheaf over the graph.
The resulting embeddings can be easily adapted for reasoning over composite relations without special training.
arXiv Detail & Related papers (2021-10-07T20:54:40Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Relational Message Passing for Knowledge Graph Completion [78.47976646383222]
We propose a relational message passing method for knowledge graph completion.
It passes relational messages among edges iteratively to aggregate neighborhood information.
Results show our method outperforms state-of-the-art knowledge graph completion methods by a large margin.
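The edge-to-edge message passing described above can be sketched in a few lines: each edge (triplet) repeatedly aggregates the messages of edges sharing an endpoint with it. The mean aggregation and initialization below are simplified illustrations, not the paper's exact update rule.

```python
import numpy as np
from collections import defaultdict

def edge_message_passing(triplets, rel_emb, num_iters=2):
    """Toy relational message passing: messages live on edges, and each
    edge averages the messages of edges incident to its endpoints."""
    # Index edges by the entities they touch.
    edges_of_ent = defaultdict(list)
    for e_idx, (h, r, t) in enumerate(triplets):
        edges_of_ent[h].append(e_idx)
        edges_of_ent[t].append(e_idx)
    # Initialize each edge's message with its relation embedding.
    msg = np.array([rel_emb[r] for _, r, _ in triplets], dtype=float)
    for _ in range(num_iters):
        new_msg = np.zeros_like(msg)
        for e_idx, (h, r, t) in enumerate(triplets):
            nbrs = [n for ent in (h, t) for n in edges_of_ent[ent] if n != e_idx]
            if nbrs:
                new_msg[e_idx] = msg[nbrs].mean(axis=0)
        msg = new_msg
    return msg

# Two edges sharing entity 1, so each sees only the other as a neighbor.
triplets = [(0, 0, 1), (1, 1, 2)]
rel_emb = np.eye(2, 4)
out = edge_message_passing(triplets, rel_emb, num_iters=1)
```

Keeping state on edges rather than nodes is what makes this style of message passing entity-agnostic: only relation information propagates, which suits inductive settings with unseen entities.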
arXiv Detail & Related papers (2020-02-17T03:33:41Z) - Generative Adversarial Zero-Shot Relational Learning for Knowledge
Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z) - Bridging Knowledge Graphs to Generate Scene Graphs [49.69377653925448]
We propose a novel graph-based neural network that iteratively propagates information between the two graphs, as well as within each of them.
Our Graph Bridging Network, GB-Net, successively infers edges and nodes, allowing it to simultaneously exploit and refine the rich, heterogeneous structure of the interconnected scene and commonsense graphs.
arXiv Detail & Related papers (2020-01-07T23:35:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.