NePTuNe: Neural Powered Tucker Network for Knowledge Graph Completion
- URL: http://arxiv.org/abs/2104.07824v1
- Date: Thu, 15 Apr 2021 23:48:26 GMT
- Title: NePTuNe: Neural Powered Tucker Network for Knowledge Graph Completion
- Authors: Shashank Sonkar, Arzoo Katiyar and Richard G. Baraniuk
- Abstract summary: We propose a new hybrid link prediction model that couples the expressiveness of deep models with the speed and size of linear models.
NePTuNe provides state-of-the-art performance on the FB15K-237 dataset and near state-of-the-art performance on the WN18RR dataset.
- Score: 31.838865331557496
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graphs link entities through relations to provide a structured representation of real-world facts. However, they are often incomplete, because they capture only a small fraction of all plausible facts. The task of knowledge graph completion via link prediction aims to overcome this challenge by inferring missing facts, represented as links between entities. Current approaches to link prediction leverage tensor factorization and/or deep learning. Factorization methods train and deploy rapidly thanks to their small number of parameters, but their underlying linear methodology limits their expressiveness. Deep learning methods are more expressive but also computationally expensive and prone to overfitting because of their large number of trainable parameters. We propose the Neural Powered Tucker Network (NePTuNe), a new hybrid link prediction model that couples the expressiveness of deep models with the speed and size of linear models. We demonstrate that NePTuNe provides state-of-the-art performance on the FB15K-237 dataset and near state-of-the-art performance on the WN18RR dataset.
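The hybrid idea can be made concrete with a short sketch. Below is a minimal PyTorch illustration (not the authors' code) of a TuckER-style linear scorer, phi(s, r, o) = W ×_1 e_s ×_2 w_r ×_3 e_o, alongside a hybrid variant that injects a cheap elementwise nonlinearity into the otherwise linear interaction. The embedding dimensions, initialization, and the placement of the nonlinearity are illustrative assumptions, not the exact NePTuNe formulation; only the vocabulary sizes (14,541 entities and 237 relations in FB15K-237) come from the dataset.

```python
import torch

# Illustrative sizes: FB15K-237 vocabularies, assumed embedding dimensions.
n_entities, n_relations = 14541, 237
d_e, d_r = 200, 200

E = torch.nn.Embedding(n_entities, d_e)                    # entity embeddings
R = torch.nn.Embedding(n_relations, d_r)                   # relation embeddings
W = torch.nn.Parameter(torch.randn(d_r, d_e, d_e) * 0.1)   # shared Tucker core tensor

def tucker_score(subj_idx, rel_idx):
    """Linear Tucker scoring (as in TuckER): contract the core tensor with
    the relation and subject embeddings, then score every candidate object."""
    e_s, w_r = E(subj_idx), R(rel_idx)               # (B, d_e), (B, d_r)
    W_r = torch.einsum('bk,kij->bij', w_r, W)        # (B, d_e, d_e)
    fused = torch.einsum('bi,bij->bj', e_s, W_r)     # (B, d_e)
    return fused @ E.weight.t()                      # (B, n_entities)

def hybrid_score(subj_idx, rel_idx):
    """Hybrid variant: the same bilinear interaction, with an elementwise
    nonlinearity on the fused subject-relation representation.
    NOTE: where the nonlinearity goes is an assumption for illustration,
    not the exact NePTuNe architecture."""
    e_s, w_r = E(subj_idx), R(rel_idx)
    W_r = torch.einsum('bk,kij->bij', w_r, W)
    fused = torch.relu(torchch.einsum('bi,bij->bj', e_s, W_r)) if False else \
            torch.relu(torch.einsum('bi,bij->bj', e_s, W_r))
    return fused @ E.weight.t()

# Score all candidate objects for a batch of (subject, relation) queries.
scores = hybrid_score(torch.tensor([0, 1]), torch.tensor([5, 7]))
print(scores.shape)  # torch.Size([2, 14541])
```

Models in this family are typically trained with 1-N scoring, i.e., a binary cross-entropy loss over the scores of all candidate objects for each (subject, relation) query. Note that the hybrid scorer above adds no trainable parameters beyond the linear one, which is the point of coupling linear speed and size with extra expressiveness.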
Related papers
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- Beyond spectral gap (extended): The role of the topology in decentralized learning [58.48291921602417]
In data-parallel optimization of machine learning models, workers collaborate to improve their estimates of the model.
Current theory does not explain why collaboration enables larger learning rates than training alone.
This paper aims to paint an accurate picture of sparsely-connected distributed optimization.
arXiv Detail & Related papers (2023-01-05T16:53:38Z)
- Beyond spectral gap: The role of the topology in decentralized learning [58.48291921602417]
In data-parallel optimization of machine learning models, workers collaborate to improve their estimates of the model.
This paper aims to paint an accurate picture of sparsely-connected distributed optimization when workers share the same data distribution.
Our theory matches empirical observations in deep learning, and accurately describes the relative merits of different graph topologies.
arXiv Detail & Related papers (2022-06-07T08:19:06Z)
- Learning Representations of Entities and Relations [0.0]
This thesis focuses on improving knowledge graph representation with the aim of tackling the link prediction task.
The first contribution is HypER, a convolutional model that simplifies and improves upon the link prediction performance of earlier convolutional models.
The second contribution is TuckER, a relatively straightforward linear model, which, at the time of its introduction, obtained state-of-the-art link prediction performance.
The third contribution is MuRP, the first multi-relational graph representation model embedded in hyperbolic space.
arXiv Detail & Related papers (2022-01-31T09:24:43Z)
- How Neural Processes Improve Graph Link Prediction [35.652234989200956]
We propose a meta-learning approach with graph neural networks for link prediction: Neural Processes for Graph Neural Networks (NPGNN).
NPGNN can perform both transductive and inductive learning tasks and adapt to patterns in a large new graph after training with a small subgraph.
arXiv Detail & Related papers (2021-09-30T07:35:13Z)
- KGRefiner: Knowledge Graph Refinement for Improving Accuracy of Translational Link Prediction Methods [4.726777092009553]
This paper proposes a method for refining the knowledge graph.
It makes the knowledge graph more informative, so link prediction can be performed more accurately.
Our experiments show that our method can significantly increase the performance of translational link prediction methods.
arXiv Detail & Related papers (2021-06-27T13:32:39Z)
- LowFER: Low-rank Bilinear Pooling for Link Prediction [4.110108749051657]
We propose a factorized bilinear pooling model, commonly used in multi-modal learning, for better fusion of entities and relations.
Our model naturally generalizes the Tucker-decomposition-based TuckER model, which has been shown to generalize other models.
We evaluate on real-world datasets, reaching on-par or state-of-the-art performance.
arXiv Detail & Related papers (2020-08-25T07:33:52Z)
- Learning Reasoning Strategies in End-to-End Differentiable Proving [50.9791149533921]
Conditional Theorem Provers learn an optimal rule-selection strategy via gradient-based optimisation.
We show that Conditional Theorem Provers are scalable and yield state-of-the-art results on the CLUTRR dataset.
arXiv Detail & Related papers (2020-07-13T16:22:14Z)
- Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.