Reasoning Through Memorization: Nearest Neighbor Knowledge Graph Embeddings
- URL: http://arxiv.org/abs/2201.05575v4
- Date: Sun, 30 Jul 2023 11:25:49 GMT
- Title: Reasoning Through Memorization: Nearest Neighbor Knowledge Graph Embeddings
- Authors: Peng Wang, Xin Xie, Xiaohan Wang, Ningyu Zhang
- Abstract summary: kNN-KGE is a new knowledge graph embedding approach with pre-trained language models.
We compute the nearest neighbors based on the distance in the entity embedding space from the knowledge store.
- Score: 29.94706167233985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Previous knowledge graph embedding approaches usually map entities
to representations and utilize score functions to predict the target entities,
yet they typically struggle to reason about rare or emerging unseen entities.
In this paper, we propose kNN-KGE, a new knowledge graph embedding approach
with pre-trained language models that linearly interpolates the model's entity
distribution with a k-nearest-neighbor distribution. We compute the nearest
neighbors based on the distance in the entity embedding space from the
knowledge store. Our approach allows rare or emerging entities to be memorized
explicitly rather than implicitly in model parameters. Experimental results
demonstrate that our approach improves inductive and transductive link
prediction results and yields better performance in low-resource settings with
only a few triples, which might be easier to reason over via explicit memory.
Code is available at https://github.com/zjunlp/KNN-KG.
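To make the interpolation concrete, below is a minimal sketch of the idea described in the abstract, assuming a model that produces logits over all entities for a masked position and a knowledge store of (entity embedding, entity id) pairs. The names (`knowledge_store`, `lambda_`, `temperature`) and the choice of L2 distance with a softmax over negative distances are illustrative assumptions, not details taken from the paper's released code.

```python
import numpy as np

def knn_interpolated_distribution(query_emb, model_logits, knowledge_store,
                                  k=16, lambda_=0.5, temperature=1.0):
    """Interpolate the model's entity distribution with a kNN distribution."""
    # knowledge_store: (N, d) float array of entity embeddings and
    # an (N,) integer array of the entity ids they belong to (assumed layout).
    keys, entity_ids = knowledge_store
    num_entities = model_logits.shape[0]

    # Nearest neighbors by L2 distance in the entity embedding space.
    dists = np.linalg.norm(keys - query_emb, axis=1)
    nn = np.argsort(dists)[:k]

    # Softmax over negative distances gives a probability mass for the
    # retrieved entries; np.add.at aggregates neighbors sharing an entity id.
    weights = np.exp(-dists[nn] / temperature)
    weights /= weights.sum()
    p_knn = np.zeros(num_entities)
    np.add.at(p_knn, entity_ids[nn], weights)

    # Model distribution from its logits, then linear interpolation.
    p_model = np.exp(model_logits - model_logits.max())
    p_model /= p_model.sum()
    return lambda_ * p_knn + (1.0 - lambda_) * p_model
```

Setting `lambda_` to 0 recovers the plain model prediction; larger values lean harder on the explicit memory, which is where the abstract expects gains for rare and emerging entities.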
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- Entity-Agnostic Representation Learning for Parameter-Efficient Knowledge Graph Embedding [30.7075844882004]
We propose an entity-agnostic representation learning method (EARL) to address the inefficient parameter storage costs incurred when embedding knowledge graphs.
We learn universal and entity-agnostic encoders for transforming distinguishable information into entity embeddings.
Experimental results show that EARL uses fewer parameters and performs better on link prediction tasks than baselines.
arXiv Detail & Related papers (2023-02-03T16:49:46Z)
- Subgraph Neighboring Relations Infomax for Inductive Link Prediction on Knowledge Graphs [4.096203468876652]
Subgraph Neighboring Relations Infomax, SNRI, exploits complete neighboring relations from two aspects.
Experiments show that SNRI outperforms existing state-of-the-art methods by a large margin on the inductive link prediction task.
arXiv Detail & Related papers (2022-07-28T01:52:39Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- Adversarial Examples for $k$-Nearest Neighbor Classifiers Based on Higher-Order Voronoi Diagrams [69.4411417775822]
Adversarial examples are a widely studied phenomenon in machine learning models.
We propose an algorithm for evaluating the adversarial robustness of $k$-nearest neighbor classification.
arXiv Detail & Related papers (2020-11-19T08:49:10Z)
- Explaining and Improving Model Behavior with k Nearest Neighbor Representations [107.24850861390196]
We propose using k nearest neighbor representations to identify training examples responsible for a model's predictions.
We show that kNN representations are effective at uncovering learned spurious associations.
Our results indicate that the kNN approach makes the finetuned model more robust to adversarial inputs.
arXiv Detail & Related papers (2020-10-18T16:55:25Z)
- Interpreting Graph Neural Networks for NLP With Differentiable Edge Masking [63.49779304362376]
Graph neural networks (GNNs) have become a popular approach to integrating structural inductive biases into NLP models.
We introduce a post-hoc method for interpreting the predictions of GNNs which identifies unnecessary edges.
We show that we can drop a large proportion of edges without deteriorating the performance of the model.
arXiv Detail & Related papers (2020-10-01T17:51:19Z)
- A Simple Approach to Case-Based Reasoning in Knowledge Bases [56.661396189466664]
We present a surprisingly simple yet accurate approach to reasoning in knowledge graphs (KGs) that requires no training and is reminiscent of case-based reasoning in classical artificial intelligence (AI).
Consider the task of finding a target entity given a source entity and a binary relation.
Our non-parametric approach derives crisp logical rules for each query by finding multiple graph path patterns that connect similar source entities through the given relation (a toy sketch follows this list).
arXiv Detail & Related papers (2020-06-25T06:28:09Z)
- PushNet: Efficient and Adaptive Neural Message Passing [1.9121961872220468]
Message passing neural networks have recently evolved into a state-of-the-art approach to representation learning on graphs.
Existing methods perform synchronous message passing along all edges in multiple subsequent rounds.
We consider a novel asynchronous message passing approach where information is pushed only along the most relevant edges until convergence.
arXiv Detail & Related papers (2020-03-04T18:15:30Z)
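As promised in the case-based reasoning entry above, here is a toy sketch of that training-free idea: retrieve entities similar to the query source, mine the short relation paths that connect those entities to their known answers, and replay the mined paths from the query source. The KG interface (a plain set of (head, relation, tail) triples), the similarity measure (overlap of outgoing relations), and the cap at two-hop paths are simplifying assumptions for illustration, not the paper's exact procedure.

```python
from collections import Counter, defaultdict

def answer_query(source, relation, triples, k_similar=3):
    """Rank candidate targets for (source, relation, ?) without any training."""
    out = defaultdict(set)       # (head, relation) -> set of tails
    rels_of = defaultdict(set)   # head -> set of outgoing relations
    for h, r, t in triples:
        out[(h, r)].add(t)
        rels_of[h].add(r)

    # 1. Retrieve entities similar to `source` that already exhibit
    #    `relation`, using overlap of outgoing relations as a crude similarity.
    cases = [e for e in rels_of if e != source and relation in rels_of[e]]
    cases.sort(key=lambda e: len(rels_of[e] & rels_of[source]), reverse=True)
    cases = cases[:k_similar]

    # 2. Mine relation paths (length <= 2) that reach each case's answers.
    paths = Counter()
    for e in cases:
        targets = out[(e, relation)]
        for r1 in rels_of[e]:
            for mid in out[(e, r1)]:
                if mid in targets:
                    paths[(r1,)] += 1
                for r2 in rels_of[mid]:
                    if out[(mid, r2)] & targets:
                        paths[(r1, r2)] += 1

    # 3. Replay the mined path patterns starting from `source`; entities
    #    reached by frequently observed patterns rank higher.
    answers = Counter()
    for path, freq in paths.items():
        frontier = {source}
        for r in path:
            frontier = set().union(*(out[(n, r)] for n in frontier))
        for candidate in frontier:
            answers[candidate] += freq
    return answers.most_common()
```

Candidates reached by many mined patterns rank first, which mirrors the per-query "crisp logical rules" described in the summary.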