Learning First-Order Rules with Relational Path Contrast for Inductive
Relation Reasoning
- URL: http://arxiv.org/abs/2110.08810v1
- Date: Sun, 17 Oct 2021 12:39:01 GMT
- Title: Learning First-Order Rules with Relational Path Contrast for Inductive
Relation Reasoning
- Authors: Yudai Pan, Jun Liu, Lingling Zhang, Xin Hu, Tianzhe Zhao and Qika Lin
- Abstract summary: We propose a graph convolutional network (GCN)-based approach for interpretable inductive reasoning with relational path contrast.
RPC-IR first extracts relational paths between two entities and learns their representations, and then introduces a contrastive strategy.
- Score: 8.344644072431898
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Relation reasoning in knowledge graphs (KGs) aims at predicting missing
relations in incomplete triples, but the dominant paradigm of learning embeddings of
relations and entities is limited to the transductive setting and cannot handle
unseen entities in the inductive setting. Previous inductive methods are scalable
and consume fewer resources: they exploit the structure of entities and triples in
subgraphs to gain inductive ability. However, to obtain better reasoning results, a
model should acquire entity-independent relational semantics in latent rules and
overcome the deficient supervision caused by the scarcity of rules in subgraphs. To
address these issues, we propose a novel graph convolutional network (GCN)-based
approach for interpretable inductive reasoning with relational path contrast, named
RPC-IR. RPC-IR first extracts relational paths between two entities and learns their
representations, and then introduces a contrastive strategy by constructing positive
and negative relational paths. A joint training strategy considering both supervised
and contrastive information is also proposed. Comprehensive experiments on three
inductive datasets show that RPC-IR achieves outstanding performance compared with
the latest inductive reasoning methods and can explicitly represent logical rules for
interpretability.
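The abstract describes a joint training objective that combines a supervised reasoning loss with a contrastive loss over positive and negative relational paths. The sketch below illustrates one way such an objective could be wired up; the InfoNCE-style contrastive term, the function names, and the weighting factor `lam` are illustrative assumptions and are not taken from the paper or its released code.

```python
# Minimal sketch (assumptions, not the authors' implementation) of a joint
# objective combining a supervised triple-scoring loss with a contrastive loss
# over positive and negative relational-path representations, as described in
# the abstract. The InfoNCE-style formulation and all names are hypothetical.
import torch
import torch.nn.functional as F


def contrastive_path_loss(anchor, positive, negatives, temperature=0.5):
    """Pull the anchor path representation toward its positive counterpart and
    push it away from corrupted (negative) paths.

    anchor:    (d,)   representation of the original relational path
    positive:  (d,)   representation of the positive (consistent) path
    negatives: (k, d) representations of corrupted relational paths
    """
    pos_sim = F.cosine_similarity(anchor, positive, dim=0) / temperature
    neg_sim = F.cosine_similarity(anchor.expand_as(negatives), negatives, dim=1) / temperature
    logits = torch.cat([pos_sim.unsqueeze(0), neg_sim]).unsqueeze(0)  # (1, 1 + k)
    # The positive pair sits at index 0; cross-entropy maximizes its probability.
    return F.cross_entropy(logits, torch.zeros(1, dtype=torch.long))


def joint_loss(score_logits, labels, anchor, positive, negatives, lam=0.5):
    """Supervised link-prediction loss plus a weighted contrastive term."""
    supervised = F.binary_cross_entropy_with_logits(score_logits, labels)
    contrastive = contrastive_path_loss(anchor, positive, negatives)
    return supervised + lam * contrastive


if __name__ == "__main__":
    d, k = 64, 8
    loss = joint_loss(
        score_logits=torch.randn(16, requires_grad=True),  # stand-in for GCN triple scores
        labels=torch.randint(0, 2, (16,)).float(),
        anchor=torch.randn(d, requires_grad=True),          # stand-in for path encodings
        positive=torch.randn(d),
        negatives=torch.randn(k, d),
    )
    loss.backward()  # in a real model, gradients flow into the GCN and path encoders
    print(float(loss))
```

In the actual model, the path and triple representations would come from the GCN-based encoders rather than random tensors; the snippet uses random inputs only so that it runs standalone.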
Related papers
- Phenomenal Yet Puzzling: Testing Inductive Reasoning Capabilities of Language Models with Hypothesis Refinement [92.61557711360652]
Language models (LMs) often fall short on inductive reasoning, despite achieving impressive success on research benchmarks.
We conduct a systematic study of the inductive reasoning capabilities of LMs through iterative hypothesis refinement.
We reveal several discrepancies between the inductive reasoning processes of LMs and humans, shedding light on both the potentials and limitations of using LMs in inductive reasoning tasks.
arXiv Detail & Related papers (2023-10-12T17:51:10Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose the Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Inductive Relation Prediction from Relational Paths and Context with Hierarchical Transformers [23.07740200588382]
This paper proposes a novel method that captures both connections between entities and the intrinsic nature of entities.
REPORT relies solely on relation semantics and can naturally generalize to the fully-inductive setting.
In experiments, REPORT performs consistently better than all baselines on almost all of the eight version subsets of the two fully-inductive datasets.
arXiv Detail & Related papers (2023-04-01T03:49:47Z)
- Multi-Aspect Explainable Inductive Relation Prediction by Sentence Transformer [60.75757851637566]
We introduce the concepts of relation path coverage and relation path confidence to filter out unreliable paths prior to model training, which improves model performance.
We propose Knowledge Reasoning Sentence Transformer (KRST) to predict inductive relations in knowledge graphs.
arXiv Detail & Related papers (2023-01-04T15:33:49Z)
- Relation-dependent Contrastive Learning with Cluster Sampling for Inductive Relation Prediction [30.404149577013595]
We introduce Relation-dependent Contrastive Learning (ReCoLe) for inductive relation prediction.
The GNN-based encoder is optimized by contrastive learning, which ensures satisfactory performance on long-tail relations.
Experimental results suggest that ReCoLe outperforms state-of-the-art methods on commonly used inductive datasets.
arXiv Detail & Related papers (2022-11-22T13:30:49Z)
- On Neural Architecture Inductive Biases for Relational Tasks [76.18938462270503]
We introduce a simple architecture based on similarity-distribution scores, which we name the Compositional Relational Network (CoRelNet).
We find that simple architectural choices can outperform existing models in out-of-distribution generalization.
arXiv Detail & Related papers (2022-06-09T16:24:01Z)
- HiURE: Hierarchical Exemplar Contrastive Learning for Unsupervised Relation Extraction [60.80849503639896]
Unsupervised relation extraction aims to extract the relationship between entities from natural language sentences without prior information on relational scope or distribution.
We propose a novel contrastive learning framework named HiURE, which has the capability to derive hierarchical signals from relational feature space using cross hierarchy attention.
Experimental results on two public datasets demonstrate the advanced effectiveness and robustness of HiURE on unsupervised relation extraction when compared with state-of-the-art models.
arXiv Detail & Related papers (2022-05-04T17:56:48Z)
- SAIS: Supervising and Augmenting Intermediate Steps for Document-Level Relation Extraction [51.27558374091491]
We propose to explicitly teach the model to capture relevant contexts and entity types by supervising and augmenting intermediate steps (SAIS) for relation extraction.
Based on a broad spectrum of carefully designed tasks, our proposed SAIS method not only extracts relations of better quality due to more effective supervision, but also retrieves the corresponding supporting evidence more accurately.
arXiv Detail & Related papers (2021-09-24T17:37:35Z)
- Topology-Aware Correlations Between Relations for Inductive Link Prediction in Knowledge Graphs [41.38172189254483]
TACT is inspired by the observation that the semantic correlation between two relations is highly correlated to their topological structure in knowledge graphs.
We categorize all relation pairs into several topological patterns, and then propose a Relational Correlation Network (RCN) to learn the importance of the different patterns for inductive link prediction.
Experiments demonstrate that TACT can effectively model semantic correlations between relations, and significantly outperforms existing state-of-the-art methods on benchmark datasets.
arXiv Detail & Related papers (2021-03-05T13:00:10Z)
- Communicative Message Passing for Inductive Relation Reasoning [17.380798747650783]
We introduce a Communicative Message Passing neural network for Inductive reLation rEasoning (CoMPILE).
In contrast to existing models, CoMPILE strengthens the message interactions between edges and entities through a communicative kernel.
arXiv Detail & Related papers (2020-12-16T12:42:06Z)