Inductive Link Prediction on N-ary Relational Facts via Semantic Hypergraph Reasoning
- URL: http://arxiv.org/abs/2503.20676v1
- Date: Wed, 26 Mar 2025 16:09:54 GMT
- Title: Inductive Link Prediction on N-ary Relational Facts via Semantic Hypergraph Reasoning
- Authors: Gongzhu Yin, Hongli Zhang, Yuchen Yang, Yi Luo
- Abstract summary: We propose an n-ary subgraph reasoning framework for fully inductive link prediction (ILP) on n-ary relational facts. Specifically, we introduce a novel graph structure, the n-ary semantic hypergraph, to facilitate subgraph extraction. We also develop a subgraph aggregating network, NS-HART, to effectively mine complex semantic correlations within subgraphs.
- Score: 13.74104688000986
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: N-ary relational facts represent semantic correlations among more than two entities. While recent studies have developed link prediction (LP) methods to infer missing relations for knowledge graphs (KGs) containing n-ary relational facts, they are generally limited to transductive settings. Fully inductive settings, where predictions are made on previously unseen entities, remain a significant challenge. As existing methods are mainly entity embedding-based, they struggle to capture entity-independent logical rules. To fill in this gap, we propose an n-ary subgraph reasoning framework for fully inductive link prediction (ILP) on n-ary relational facts. This framework reasons over local subgraphs and has a strong inductive inference ability to capture n-ary patterns. Specifically, we introduce a novel graph structure, the n-ary semantic hypergraph, to facilitate subgraph extraction. Moreover, we develop a subgraph aggregating network, NS-HART, to effectively mine complex semantic correlations within subgraphs. Theoretically, we provide a thorough analysis from the score function optimization perspective to shed light on NS-HART's effectiveness for n-ary ILP tasks. Empirically, we conduct extensive experiments on a series of inductive benchmarks, including transfer reasoning (with and without entity features) and pairwise subgraph reasoning. The results highlight the superiority of the n-ary subgraph reasoning framework and the exceptional inductive ability of NS-HART. The source code of this paper has been made publicly available at https://github.com/yin-gz/Nary-Inductive-SubGraph.
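As a rough illustration of the two ingredients described in the abstract, the sketch below shows how an n-ary fact could be stored as a single hyperedge of a semantic hypergraph and how a local subgraph around a query's known entities might be extracted. This is a minimal sketch under assumed interfaces: every class, function, and role name here (NaryHypergraph, Hyperedge, extract_subgraph, the example roles) is an illustrative assumption, not the authors' released implementation (see the linked repository for that).

```python
# Minimal, illustrative sketch, NOT the paper's implementation: an n-ary fact is
# stored as one hyperedge over its entities, and a local k-hop subgraph is
# gathered around a query's known entities. All names are assumptions.
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Hyperedge:
    relation: str        # main relation of the n-ary fact
    members: tuple       # ((entity, role), ...) for every participant


@dataclass
class NaryHypergraph:
    edges: list = field(default_factory=list)
    incident: defaultdict = field(default_factory=lambda: defaultdict(set))  # entity -> edge ids

    def add_fact(self, relation, entity_role_pairs):
        """Register one n-ary fact as a single hyperedge."""
        edge_id = len(self.edges)
        self.edges.append(Hyperedge(relation, tuple(entity_role_pairs)))
        for entity, _role in entity_role_pairs:
            self.incident[entity].add(edge_id)
        return edge_id

    def extract_subgraph(self, seed_entities, num_hops=2):
        """Collect entities and hyperedges within `num_hops` of the seeds;
        this local context is what a subgraph reasoner would score."""
        frontier = set(seed_entities)
        seen_entities, seen_edges = set(seed_entities), set()
        for _ in range(num_hops):
            next_frontier = set()
            for entity in frontier:
                for edge_id in self.incident.get(entity, ()):
                    if edge_id in seen_edges:
                        continue
                    seen_edges.add(edge_id)
                    for neighbor, _role in self.edges[edge_id].members:
                        if neighbor not in seen_entities:
                            seen_entities.add(neighbor)
                            next_frontier.add(neighbor)
            frontier = next_frontier
        return seen_entities, [self.edges[i] for i in sorted(seen_edges)]


# Usage with a toy n-ary fact ("Einstein received the Nobel Prize in Physics in 1921"):
hg = NaryHypergraph()
hg.add_fact("award_received",
            [("Einstein", "recipient"), ("Nobel_Prize_in_Physics", "award"),
             ("1921", "point_in_time")])
entities, edges = hg.extract_subgraph({"Einstein", "Nobel_Prize_in_Physics"}, num_hops=2)
```

In the framework described above, the extracted hyperedges would then be passed to an aggregating network such as NS-HART to score candidate entities for the missing position.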
Related papers
- Graph Stochastic Neural Process for Inductive Few-shot Knowledge Graph Completion [63.68647582680998]
We focus on a task called inductive few-shot knowledge graph completion (I-FKGC).
Inspired by the idea of inductive reasoning, we cast I-FKGC as an inductive reasoning problem.
We present a neural process-based hypothesis extractor that models the joint distribution of hypotheses, from which we can sample a hypothesis for predictions.
In the second module, based on the hypothesis, we propose a graph attention-based predictor to test if the triple in the query set aligns with the extracted hypothesis.
arXiv Detail & Related papers (2024-08-03T13:37:40Z) - EiG-Search: Generating Edge-Induced Subgraphs for GNN Explanation in Linear Time [30.44473492282072]
Most existing subgraph-level explainers face efficiency challenges in explaining Graph Neural Networks (GNNs) due to complex search processes.
In this paper, we reveal that inducing subgraph explanations by edges is more comprehensive than other subgraph inducing techniques.
We employ an efficient linear-time search algorithm over the edge-induced subgraphs, where the edges are ranked by an enhanced gradient-based importance.
arXiv Detail & Related papers (2024-05-02T21:55:12Z) - Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of the Relational Correlation Network (RCN), we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z) - Inductive Knowledge Graph Completion with GNNs and Rules: An Analysis [18.11743347414004]
Rule-based methods significantly outperform state-of-the-art methods based on Graph Neural Networks (GNNs).
We study a number of variants of a rule-based approach, which are specifically aimed at addressing the aforementioned issues.
We find that the resulting models can achieve a performance which is close to that of NBFNet.
arXiv Detail & Related papers (2023-08-14T21:01:29Z) - Towards Few-shot Inductive Link Prediction on Knowledge Graphs: A Relational Anonymous Walk-guided Neural Process Approach [49.00753238429618]
Few-shot inductive link prediction on knowledge graphs aims to predict missing links for unseen entities with few-shot links observed.
Recent inductive methods utilize the sub-graphs around unseen entities to obtain the semantics and predict links inductively.
We propose a novel relational anonymous walk-guided neural process for few-shot inductive link prediction on knowledge graphs, denoted as RawNP.
arXiv Detail & Related papers (2023-06-26T12:02:32Z) - Improving Expressivity of GNNs with Subgraph-specific Factor Embedded Normalization [30.86182962089487]
Graph Neural Networks (GNNs) have emerged as a powerful category of learning architecture for handling graph-structured data.
We propose a dedicated plug-and-play normalization scheme, termed SUbgraph-sPEcific FactoR Embedded Normalization (SuperNorm).
arXiv Detail & Related papers (2023-05-31T14:37:31Z) - Few-shot Link Prediction on N-ary Facts [70.8150181683017]
Link Prediction on Hyper-relational Facts (LPHFs) is the task of predicting a missing element in a hyper-relational fact.
Few-Shot Link Prediction on Hyper-relational Facts (FSLPHFs) aims to predict a missing entity in a hyper-relational fact with limited support instances.
arXiv Detail & Related papers (2023-05-10T12:44:00Z) - Subgraph Neighboring Relations Infomax for Inductive Link Prediction on Knowledge Graphs [4.096203468876652]
Subgraph Neighboring Relations Infomax, SNRI, exploits complete neighboring relations from two aspects.
Experiments show SNRI outperforms existing state-of-the-art methods by a large margin on the inductive link prediction task.
arXiv Detail & Related papers (2022-07-28T01:52:39Z) - FactGraph: Evaluating Factuality in Summarization with Semantic Graph Representations [114.94628499698096]
We propose FactGraph, a method that decomposes the document and the summary into structured meaning representations (MRs).
MRs describe core semantic concepts and their relations, aggregating the main content in both document and summary in a canonical form, and reducing data sparsity.
Experiments on different benchmarks for evaluating factuality show that FactGraph outperforms previous approaches by up to 15%.
arXiv Detail & Related papers (2022-04-13T16:45:33Z) - Deconfounding to Explanation Evaluation in Graph Neural Networks [136.73451468551656]
We argue that a distribution shift exists between the full graph and the subgraph, causing the out-of-distribution problem.
We propose Deconfounded Subgraph Evaluation (DSE) which assesses the causal effect of an explanatory subgraph on the model prediction.
arXiv Detail & Related papers (2022-01-21T18:05:00Z) - Generalizing Graph Neural Networks on Out-Of-Distribution Graphs [51.33152272781324]
Existing Graph Neural Networks (GNNs) are proposed without considering the distribution shifts between training and testing graphs.
In such a setting, GNNs tend to exploit subtle statistical correlations in the training set for predictions, even when these correlations are spurious.
We propose a general causal representation framework, called StableGNN, to eliminate the impact of spurious correlations.
arXiv Detail & Related papers (2021-11-20T18:57:18Z)