Few-shot Link Prediction on N-ary Facts
- URL: http://arxiv.org/abs/2305.06104v3
- Date: Tue, 2 Apr 2024 07:11:01 GMT
- Title: Few-shot Link Prediction on N-ary Facts
- Authors: Jiyao Wei, Saiping Guan, Xiaolong Jin, Jiafeng Guo, Xueqi Cheng
- Abstract summary: Link Prediction on Hyper-relational Facts (LPHFs) is the task of predicting a missing element in a hyper-relational fact.
Few-Shot Link Prediction on Hyper-relational Facts (FSLPHFs) aims to predict a missing entity in a hyper-relational fact with limited support instances.
- Score: 70.8150181683017
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyper-relational facts, which consist of a primary triple (head entity, relation, tail entity) and auxiliary attribute-value pairs, are widely present in real-world Knowledge Graphs (KGs). Link Prediction on Hyper-relational Facts (LPHFs) is the task of predicting a missing element in a hyper-relational fact, which helps populate and enrich KGs. However, existing LPHFs studies usually require a substantial amount of high-quality data. They overlook few-shot relations, which have limited instances yet are common in real-world scenarios. Thus, we introduce a new task, Few-Shot Link Prediction on Hyper-relational Facts (FSLPHFs). It aims to predict a missing entity in a hyper-relational fact with limited support instances. To tackle FSLPHFs, we propose MetaRH, a model that learns Meta Relational information in Hyper-relational facts. MetaRH comprises three modules: relation learning, support-specific adjustment, and query inference. By capturing meta relational information from limited support instances, MetaRH can accurately predict the missing entity in a query. As there is no existing dataset available for this new task, we construct three datasets to validate the effectiveness of MetaRH. Experimental results on these datasets demonstrate that MetaRH significantly outperforms existing representative models.
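To make the task setup concrete, the sketch below shows one possible way to represent a hyper-relational fact (a primary triple plus qualifier pairs) and a few-shot support/query episode. It is a minimal illustration only: the class names, field names, and the example fact are assumptions for this sketch and are not taken from the MetaRH code or its datasets.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# A hyper-relational fact: a primary triple plus optional attribute-value qualifiers.
@dataclass
class HyperRelationalFact:
    head: str
    relation: str
    tail: str
    qualifiers: List[Tuple[str, str]] = field(default_factory=list)

# Illustrative fact: a primary triple enriched with two qualifier pairs.
fact = HyperRelationalFact(
    head="Marie Curie",
    relation="educated_at",
    tail="University of Paris",
    qualifiers=[("academic_degree", "Doctorate"), ("academic_major", "Physics")],
)

# A few-shot episode for one relation: a small support set of facts plus queries
# whose missing entity must be predicted from the limited support instances.
@dataclass
class FewShotEpisode:
    relation: str
    support: List[HyperRelationalFact]   # limited instances (e.g. K = 3 or 5)
    queries: List[HyperRelationalFact]   # the tail (or head) entity is the prediction target

episode = FewShotEpisode(relation="educated_at", support=[fact], queries=[])
print(episode)
```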
Related papers
- UniHR: Hierarchical Representation Learning for Unified Knowledge Graph Link Prediction [41.46369433488762]
We propose a unified Hierarchical Representation learning framework (UniHR) for unified knowledge graph link prediction.
It consists of a unified Hierarchical Data Representation (HiDR) module and a unified Hierarchical Structure Learning (HiSL) module as a graph encoder.
We show that our UniHR outperforms baselines designed for one specific kind of KG, indicating the strong generalization capability of the HiDR form and the effectiveness of the HiSL module.
arXiv Detail & Related papers (2024-11-11T14:22:42Z)
- HyperMono: A Monotonicity-aware Approach to Hyper-Relational Knowledge Representation [27.28214706269035]
In a hyper-relational knowledge graph (HKG), each fact is composed of a main triple associated with attribute-value qualifiers, which express additional factual knowledge.
This paper proposes the HyperMono model for hyper-relational knowledge graph completion, which realizes stage reasoning and qualifier monotonicity.
arXiv Detail & Related papers (2024-04-15T15:00:17Z)
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- A Dataset for Hyper-Relational Extraction and a Cube-Filling Approach [59.89749342550104]
We propose the task of hyper-relational extraction to extract more specific and complete facts from text.
Existing models cannot perform hyper-relational extraction as it requires a model to consider the interaction between three entities.
We propose CubeRE, a cube-filling model inspired by table-filling approaches that explicitly considers the interaction between relation triplets and qualifiers.
arXiv Detail & Related papers (2022-11-18T03:51:28Z)
- Learning Representations for Hyper-Relational Knowledge Graphs [35.380689788802776]
We design a framework to learn representations for hyper-relational facts using multiple aggregators.
Experiments demonstrate the effectiveness of our framework across multiple datasets.
We conduct an ablation study that validates the importance of the various components in our framework.
arXiv Detail & Related papers (2022-08-30T15:02:14Z)
- Link Prediction on N-ary Relational Data Based on Relatedness Evaluation [61.61555159755858]
We propose a method called NaLP to conduct link prediction on n-ary relational data.
We represent each n-ary relational fact as a set of its role and role-value pairs (see the sketch after this list).
Experimental results validate the effectiveness and merits of the proposed methods.
arXiv Detail & Related papers (2021-04-21T09:06:54Z)
- Exploring the Limits of Few-Shot Link Prediction in Knowledge Graphs [49.6661602019124]
We study a spectrum of models derived by generalizing the current state of the art for few-shot link prediction.
We find that a simple zero-shot baseline - which ignores any relation-specific information - achieves surprisingly strong performance.
Experiments on carefully crafted synthetic datasets show that having only a few examples of a relation fundamentally limits models from using fine-grained structural information.
arXiv Detail & Related papers (2021-02-05T21:04:31Z)
- Message Passing for Hyper-Relational Knowledge Graphs [7.733963597282456]
We propose StarE, a message passing graph encoder capable of modeling hyper-relational knowledge graphs.
StarE can encode an arbitrary number of qualifiers along with the main triple while keeping the semantic roles of qualifiers and triples intact.
Our experiments demonstrate that a StarE-based LP model outperforms existing approaches across multiple benchmarks.
arXiv Detail & Related papers (2020-09-22T22:38:54Z)
- Type-augmented Relation Prediction in Knowledge Graphs [65.88395564516115]
We propose a type-augmented relation prediction (TaRP) method, where we apply both the type information and instance-level information for relation prediction.
Our proposed TaRP method achieves significantly better performance than state-of-the-art methods on four benchmark datasets.
arXiv Detail & Related papers (2020-09-16T21:14:18Z)
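As a companion to the NaLP entry above, which represents each n-ary fact as a set of role and role-value pairs, the following is a minimal, hypothetical sketch of that conversion. The function name and the head/tail role naming scheme are assumptions for illustration and are not taken from the NaLP code.

```python
from typing import Dict, List, Tuple

def to_role_value_pairs(head: str, relation: str, tail: str,
                        qualifiers: Dict[str, str]) -> List[Tuple[str, str]]:
    """Flatten an n-ary fact into (role, value) pairs, in the spirit of the NaLP-style view.

    The head/tail role names below are an illustrative convention, not NaLP's actual scheme.
    """
    pairs = [(f"{relation}_head", head), (f"{relation}_tail", tail)]
    pairs.extend(qualifiers.items())
    return pairs

# Example: convert a hyper-relational fact into role-value pairs.
print(to_role_value_pairs(
    "Marie Curie", "educated_at", "University of Paris",
    {"academic_degree": "Doctorate", "academic_major": "Physics"},
))
# [('educated_at_head', 'Marie Curie'), ('educated_at_tail', 'University of Paris'),
#  ('academic_degree', 'Doctorate'), ('academic_major', 'Physics')]
```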