Bi-Link: Bridging Inductive Link Predictions from Text via Contrastive
Learning of Transformers and Prompts
- URL: http://arxiv.org/abs/2210.14463v1
- Date: Wed, 26 Oct 2022 04:31:07 GMT
- Title: Bi-Link: Bridging Inductive Link Predictions from Text via Contrastive
Learning of Transformers and Prompts
- Authors: Bohua Peng, Shihao Liang and Mobarakol Islam
- Abstract summary: We propose Bi-Link, a contrastive learning framework with probabilistic syntax prompts for link predictions.
Using grammatical knowledge of BERT, we efficiently search for relational prompts according to learnt syntactical patterns that generalize to large knowledge graphs.
In our experiments, Bi-Link outperforms recent baselines on link prediction datasets.
- Score: 2.9972063833424216
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inductive knowledge graph completion requires models to comprehend the
underlying semantics and logic patterns of relations. With the advance of
pretrained language models, recent research has designed transformers for link
prediction tasks. However, empirical studies show that linearizing triples
affects the learning of relational patterns, such as inversion and symmetry. In
this paper, we propose Bi-Link, a contrastive learning framework with
probabilistic syntax prompts for link predictions. Using grammatical knowledge
of BERT, we efficiently search for relational prompts according to learnt
syntactical patterns that generalize to large knowledge graphs. To better
express symmetric relations, we design a symmetric link prediction model,
establishing bidirectional linking between forward prediction and backward
prediction. This bidirectional linking accommodates flexible self-ensemble
strategies at test time. In our experiments, Bi-Link outperforms recent
baselines on link prediction datasets (WN18RR, FB15K-237, and Wikidata5M).
Furthermore, we construct Zeshel-Ind, an in-domain inductive entity linking
environment, to evaluate Bi-Link. The experimental results demonstrate that
our method yields robust representations that generalize under domain
shift.
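The bidirectional linking described in the abstract suggests a simple test-time self-ensemble: score a candidate link in the forward direction (head, relation → tail) and the backward direction (tail, inverse relation → head), then combine the two scores. The sketch below is an illustrative assumption, not the paper's implementation: the encoder functions, cosine scoring, and the weighting parameter `alpha` are all placeholders.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def self_ensemble_score(head, tail, encode_forward, encode_backward, alpha=0.5):
    """Combine forward (h, r -> t) and backward (t, r^-1 -> h) scores.

    encode_forward / encode_backward are assumed encoders that return a
    (query_vector, candidate_vector) pair for each direction.
    """
    q_f, c_f = encode_forward(head, tail)    # forward prediction
    q_b, c_b = encode_backward(tail, head)   # backward prediction
    return alpha * cosine(q_f, c_f) + (1 - alpha) * cosine(q_b, c_b)

# Toy usage: random vectors stand in for real encoder outputs.
rng = np.random.default_rng(0)
fwd = lambda h, t: (rng.standard_normal(4), rng.standard_normal(4))
bwd = lambda t, h: (rng.standard_normal(4), rng.standard_normal(4))
score = self_ensemble_score("h", "t", fwd, bwd)
```

Because each cosine term lies in [-1, 1], the ensembled score does too, which makes different `alpha` weightings directly comparable at test time.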
Related papers
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Linearity of Relation Decoding in Transformer Language Models [82.47019600662874]
Much of the knowledge encoded in transformer language models (LMs) may be expressed in terms of relations.
We show that, for a subset of relations, this computation is well-approximated by a single linear transformation on the subject representation.
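The claim that relation decoding is well-approximated by a single linear transformation on the subject representation can be illustrated with synthetic vectors. In this hedged sketch, the least-squares fit and the random "representations" are stand-ins for the paper's actual LMs and estimation procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8
# Synthetic "subject" representations and a ground-truth linear relation map.
S = rng.standard_normal((50, d))
W_true = rng.standard_normal((d, d))
O = S @ W_true  # "object" representations produced by the relation

# Estimate a linear decoder from (subject, object) pairs via least squares.
W_hat, *_ = np.linalg.lstsq(S, O, rcond=None)

# If the relation really is linear, the fitted map should reproduce the
# object representation of a previously unseen subject.
s_new = rng.standard_normal(d)
err = np.linalg.norm(s_new @ W_hat - s_new @ W_true)
```

In real LMs the fit is only approximate for a subset of relations, so `err` measures how far a given relation departs from linearity.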
arXiv Detail & Related papers (2023-08-17T17:59:19Z)
- Towards Few-shot Inductive Link Prediction on Knowledge Graphs: A Relational Anonymous Walk-guided Neural Process Approach [49.00753238429618]
Few-shot inductive link prediction on knowledge graphs aims to predict missing links for unseen entities with few-shot links observed.
Recent inductive methods utilize the sub-graphs around unseen entities to obtain the semantics and predict links inductively.
We propose a novel relational anonymous walk-guided neural process for few-shot inductive link prediction on knowledge graphs, denoted as RawNP.
arXiv Detail & Related papers (2023-06-26T12:02:32Z)
- Link Prediction on Heterophilic Graphs via Disentangled Representation Learning [28.770767478688008]
We study a novel problem of exploring disentangled representation learning for link prediction on heterophilic graphs.
DisenLink can learn disentangled representations by modeling the link formation and perform factor-aware message-passing to facilitate link prediction.
arXiv Detail & Related papers (2022-08-03T02:48:26Z)
- A Simple yet Effective Relation Information Guided Approach for Few-Shot Relation Extraction [22.60428265210431]
Few-Shot Relation Extraction aims at predicting the relation for a pair of entities in a sentence by training with a few labelled examples in each relation.
Some recent works have introduced relation information to assist model learning based on Prototype Network.
We argue that relation information can be introduced more explicitly and effectively into the model.
arXiv Detail & Related papers (2022-05-19T13:03:01Z)
- Learning Representations of Entities and Relations [0.0]
This thesis focuses on improving knowledge graph representation with the aim of tackling the link prediction task.
The first contribution is HypER, a convolutional model which simplifies and improves upon the link prediction performance.
The second contribution is TuckER, a relatively straightforward linear model, which, at the time of its introduction, obtained state-of-the-art link prediction performance.
The third contribution is MuRP, the first multi-relational graph representation model embedded in hyperbolic space.
arXiv Detail & Related papers (2022-01-31T09:24:43Z)
- Link Prediction with Contextualized Self-Supervision [63.25455976593081]
Link prediction aims to infer the existence of a link between two nodes in a network.
Traditional link prediction algorithms are hindered by three major challenges -- link sparsity, node attribute noise and network dynamics.
We propose a Contextualized Self-Supervised Learning framework that fully exploits structural context prediction for link prediction.
arXiv Detail & Related papers (2022-01-25T03:12:32Z)
- Link Prediction on N-ary Relational Data Based on Relatedness Evaluation [61.61555159755858]
We propose a method called NaLP to conduct link prediction on n-ary relational data.
We represent each n-ary relational fact as a set of its role and role-value pairs.
Experimental results validate the effectiveness and merits of the proposed methods.
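Representing an n-ary fact as a set of role and role-value pairs, as the NaLP summary above describes, can be sketched with a plain dictionary. The example fact, role names, and helper function here are made up for illustration and are not NaLP's data format:

```python
# An n-ary fact, e.g. "person P played character C in film F in year Y",
# stored as role -> value pairs rather than a single (head, relation, tail)
# triple, so any number of roles can participate in one fact.
fact = {
    "actor": "P",
    "character": "C",
    "film": "F",
    "year": "Y",
}

def candidate_facts(fact, missing_role, candidates):
    """Link prediction on n-ary data: given all but one role-value pair,
    enumerate candidate completions for the missing role."""
    known = {r: v for r, v in fact.items() if r != missing_role}
    return [dict(known, **{missing_role: c}) for c in candidates]

completions = candidate_facts(fact, "year", ["Y", "Z"])
```

A scoring model would then rank each completed fact, rather than ranking tail entities of a binary triple.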
arXiv Detail & Related papers (2021-04-21T09:06:54Z)
- Topology-Aware Correlations Between Relations for Inductive Link Prediction in Knowledge Graphs [41.38172189254483]
TACT is inspired by the observation that the semantic correlation between two relations is highly correlated with their topological structure in knowledge graphs.
We categorize all relation pairs into several topological patterns, then propose a Relational Correlation Network (RCN) to learn the importance of the different patterns for inductive link prediction.
Experiments demonstrate that TACT can effectively model semantic correlations between relations, and significantly outperforms existing state-of-the-art methods on benchmark datasets.
arXiv Detail & Related papers (2021-03-05T13:00:10Z)
- Generalized Relation Learning with Semantic Correlation Awareness for Link Prediction [29.23338194883254]
We propose a unified Generalized Relation Learning framework to address the above two problems.
After training with GRL, the closeness of semantically similar relations in vector space and the discrimination of dissimilar relations are improved.
arXiv Detail & Related papers (2020-06-11T17:42:46Z)
- Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.