Relational Learning with Gated and Attentive Neighbor Aggregator for
Few-Shot Knowledge Graph Completion
- URL: http://arxiv.org/abs/2104.13095v1
- Date: Tue, 27 Apr 2021 10:38:44 GMT
- Title: Relational Learning with Gated and Attentive Neighbor Aggregator for
Few-Shot Knowledge Graph Completion
- Authors: Guanglin Niu, Yang Li, Chengguang Tang, Ruiying Geng, Jian Dai, Qiao
Liu, Hao Wang, Jian Sun, Fei Huang, Luo Si
- Abstract summary: We propose a few-shot relational learning framework with a global-local design to address the above issues.
At the local stage, a meta-learning-based TransH method is designed to model complex relations and train our model in a few-shot learning fashion.
Our model achieves 5-shot FKGC performance improvements of 8.0% on NELL-One and 2.8% on Wiki-One by the metric Hits@10.
- Score: 33.59045268013895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Aiming at expanding few-shot relations' coverage in knowledge graphs (KGs),
few-shot knowledge graph completion (FKGC) has recently gained increasing research
interest. Some existing models employ a few-shot relation's multi-hop neighbor
information to enhance its semantic representation. However, noisy neighbor
information may be amplified when the neighborhood is excessively sparse and
no neighbor is available to represent the few-shot relation. Moreover, modeling
and inferring complex relations of one-to-many (1-N), many-to-one (N-1), and
many-to-many (N-N) by previous knowledge graph completion approaches requires
high model complexity and a large amount of training instances. Thus, inferring
complex relations in the few-shot scenario is difficult for FKGC models due to
limited training instances. In this paper, we propose a few-shot relational
learning framework with a global-local design to address the above issues. At the global
stage, a novel gated and attentive neighbor aggregator is built to accurately
integrate the semantics of a few-shot relation's neighborhood, which helps
filter noisy neighbors even if a KG contains extremely sparse
neighborhoods. At the local stage, a meta-learning-based TransH (MTransH)
method is designed to model complex relations and train our model in a few-shot
learning fashion. Extensive experiments show that our model outperforms the
state-of-the-art FKGC approaches on the frequently-used benchmark datasets
NELL-One and Wiki-One. Compared with the strong baseline model MetaR, our model
achieves 5-shot FKGC performance improvements of 8.0% on NELL-One and 2.8% on
Wiki-One by the metric Hits@10.
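The two components described in the abstract can be illustrated with a minimal sketch. Below is a hedged NumPy sketch of the TransH scoring function that MTransH builds on, together with a simplified gated, attention-weighted neighbor aggregator. The specific attention and gating forms shown here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def transh_score(h, t, w_r, d_r):
    """TransH score: project head and tail embeddings onto the relation
    hyperplane (unit normal w_r), then measure ||h_perp + d_r - t_perp||.
    Lower scores indicate more plausible triples."""
    w_r = w_r / np.linalg.norm(w_r)        # normalize the hyperplane normal
    h_perp = h - np.dot(w_r, h) * w_r      # projection onto the hyperplane
    t_perp = t - np.dot(w_r, t) * w_r
    return np.linalg.norm(h_perp + d_r - t_perp)

def gated_attentive_aggregate(entity, neighbors):
    """Attention-weighted neighbor sum, blended with the entity embedding
    through a sigmoid gate so that sparse or noisy neighborhoods can be
    down-weighted (the gate falls back to the entity itself)."""
    logits = neighbors @ entity                 # (n,) attention logits
    alpha = np.exp(logits - logits.max())
    alpha /= alpha.sum()                        # softmax attention weights
    agg = alpha @ neighbors                     # weighted neighbor sum
    gate = 1.0 / (1.0 + np.exp(-np.dot(entity, agg)))  # scalar gate
    return gate * agg + (1.0 - gate) * entity
```

With `w_r = (0, 0, 1)`, the triple `h = (1, 0, 0)`, `d_r = (1, 0, 0)`, `t = (2, 0, 5)` scores zero, since the projections satisfy the translation exactly on the hyperplane.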
Related papers
- ReasoningLM: Enabling Structural Subgraph Reasoning in Pre-trained
Language Models for Question Answering over Knowledge Graph [142.42275983201978]
We propose a subgraph-aware self-attention mechanism to imitate the GNN for performing structured reasoning.
We also adopt an adaptation tuning strategy to adapt the model parameters with 20,000 subgraphs with synthesized questions.
Experiments show that ReasoningLM surpasses state-of-the-art models by a large margin, even with fewer updated parameters and less training data.
arXiv Detail & Related papers (2023-12-30T07:18:54Z)
- RelBERT: Embedding Relations with Language Models [29.528217625083546]
We propose to extract relation embeddings from relatively small language models.
RelBERT captures relational similarity in a surprisingly fine-grained way.
It is capable of modelling relations that go well beyond what the model has seen during training.
arXiv Detail & Related papers (2023-09-30T08:15:36Z)
- A RelEntLess Benchmark for Modelling Graded Relations between Named
Entities [29.528217625083546]
We introduce a new benchmark, in which entity pairs have to be ranked according to how much they satisfy a given graded relation.
We find a strong correlation between model size and performance, with smaller Language Models struggling to outperform a naive baseline.
The results of the largest Flan-T5 and OPT models are remarkably strong, although a clear gap with human performance remains.
arXiv Detail & Related papers (2023-05-24T10:41:24Z)
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph
Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question
Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph(KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Hierarchical Relational Learning for Few-Shot Knowledge Graph Completion [25.905974480733562]
We propose a hierarchical relational learning method (HiRe) for few-shot KG completion.
By jointly capturing three levels of relational information, HiRe can effectively learn and refine the meta representation of few-shot relations.
Experiments on two benchmark datasets validate the superiority of HiRe against other state-of-the-art methods.
arXiv Detail & Related papers (2022-09-02T17:57:03Z)
- RMNA: A Neighbor Aggregation-Based Knowledge Graph Representation
Learning Model Using Rule Mining [9.702290899930608]
Neighbor aggregation-based representation learning (NARL) models are proposed, which encode the information in the neighbors of an entity into its embeddings.
We propose a NARL model named RMNA, which obtains and filters Horn rules through a rule mining algorithm, and uses the selected Horn rules to transform valuable multi-hop neighbors into one-hop neighbors.
arXiv Detail & Related papers (2021-11-01T02:08:26Z)
- Integrating Semantics and Neighborhood Information with Graph-Driven
Generative Models for Document Retrieval [51.823187647843945]
In this paper, we encode the neighborhood information with a graph-induced Gaussian distribution, and propose to integrate the two types of information with a graph-driven generative model.
Under the approximation, we prove that the training objective can be decomposed into terms involving only singleton or pairwise documents, enabling the model to be trained as efficiently as uncorrelated ones.
arXiv Detail & Related papers (2021-05-27T11:29:03Z)
- Learning Intents behind Interactions with Knowledge Graph for
Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relation at a fine-grained level of intents.
We propose a new model, the Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- A Frustratingly Easy Approach for Entity and Relation Extraction [25.797992240847833]
We present a simple pipelined approach for entity and relation extraction.
We establish the new state-of-the-art on standard benchmarks (ACE04, ACE05, and SciERC).
Our approach essentially builds on two independent encoders and merely uses the entity model to construct the input for the relation model.
arXiv Detail & Related papers (2020-10-24T07:14:01Z)
- A Simple Approach to Case-Based Reasoning in Knowledge Bases [56.661396189466664]
We present a surprisingly simple yet accurate approach to reasoning in knowledge graphs (KGs) that requires no training, and is reminiscent of case-based reasoning in classical artificial intelligence (AI).
Consider the task of finding a target entity given a source entity and a binary relation.
Our non-parametric approach derives crisp logical rules for each query by finding multiple graph path patterns that connect similar source entities through the given relation.
arXiv Detail & Related papers (2020-06-25T06:28:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.