Incorporating Structured Sentences with Time-enhanced BERT for
Fully-inductive Temporal Relation Prediction
- URL: http://arxiv.org/abs/2304.04717v1
- Date: Mon, 10 Apr 2023 17:22:15 GMT
- Title: Incorporating Structured Sentences with Time-enhanced BERT for
Fully-inductive Temporal Relation Prediction
- Authors: Zhongwu Chen, Chengjin Xu, Fenglong Su, Zhen Huang, Yong Dou
- Abstract summary: Temporal relation prediction in incomplete temporal knowledge graphs (TKGs) is a popular temporal knowledge graph completion (TKGC) problem.
Traditional embedding-based TKGC models rely on structured connections and can only handle a fixed set of entities.
In this work, we extend the fully-inductive setting, where entities in the training and test sets are totally disjoint, into TKGs.
Our model can obtain the entity history and implicitly learn rules in the semantic space by encoding structured sentences.
- Score: 13.070974291417318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal relation prediction in incomplete temporal knowledge graphs (TKGs)
is a popular temporal knowledge graph completion (TKGC) problem in both
transductive and inductive settings. Traditional embedding-based TKGC models
(TKGE) rely on structured connections and can only handle a fixed set of
entities, i.e., the transductive setting. In the inductive setting where test
TKGs contain emerging entities, the latest methods are based on symbolic rules
or pre-trained language models (PLMs). However, they suffer from being
inflexible and not time-specific, respectively. In this work, we extend the
fully-inductive setting, where entities in the training and test sets are
totally disjoint, into TKGs and take a further step towards a more flexible and
time-sensitive temporal relation prediction approach SST-BERT, incorporating
Structured Sentences with Time-enhanced BERT. Our model can obtain the entity
history and implicitly learn rules in the semantic space by encoding structured
sentences, solving the problem of inflexibility. We propose to use a time
masking MLM task to pre-train BERT in a corpus rich in temporal tokens
specially generated for TKGs, enhancing the time sensitivity of SST-BERT. To
compute the probability of occurrence of a target quadruple, we aggregate all
its structured sentences from both temporal and semantic perspectives into a
score. Experiments on the transductive datasets and newly generated
fully-inductive benchmarks show that SST-BERT successfully improves over
state-of-the-art baselines.
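To make the time-masking idea concrete, below is a minimal sketch, not the authors' released code, of what a time-masking MLM objective could look like: a quadruple is verbalized into a structured sentence, only the temporal tokens are replaced with [MASK], and the MLM loss is computed on those positions alone. The helper names (`verbalize_quadruple`, `time_masked_inputs`) and the verbalization template are illustrative assumptions, not taken from the paper.
```python
# Hedged sketch of time-masked MLM pre-training (assumed reconstruction,
# not SST-BERT's actual code). Only temporal tokens are masked, so the
# model is forced to learn time-specific information.
import torch
from transformers import BertForMaskedLM, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

def verbalize_quadruple(head, relation, tail, timestamp):
    # Hypothetical template turning (h, r, t, time) into a structured
    # sentence; the paper's actual templates may differ.
    return f"{head} {relation} {tail} in {timestamp} ."

def time_masked_inputs(sentence, time_string):
    # Mask every token belonging to the timestamp (rough id-matching
    # heuristic) and set labels to -100 elsewhere, so the MLM loss is
    # computed on the temporal positions only.
    enc = tokenizer(sentence, return_tensors="pt")
    labels = enc.input_ids.clone()
    time_ids = set(tokenizer(time_string, add_special_tokens=False).input_ids)
    is_time = torch.tensor(
        [[tok in time_ids for tok in enc.input_ids[0].tolist()]]
    )
    enc["input_ids"] = enc.input_ids.masked_fill(is_time, tokenizer.mask_token_id)
    labels[~is_time] = -100  # positions ignored by the loss
    return enc, labels

sentence = verbalize_quadruple("Barack Obama", "made a visit to", "France", "2014")
enc, labels = time_masked_inputs(sentence, "2014")
loss = model(**enc, labels=labels).loss  # time-masking MLM objective
loss.backward()
```
At inference, the abstract indicates that each structured sentence retrieved for a candidate quadruple is encoded and the per-sentence scores are aggregated from both temporal and semantic perspectives into one probability of occurrence; the exact aggregation is detailed in the paper.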
Related papers
- Large Language Models-guided Dynamic Adaptation for Temporal Knowledge Graph Reasoning [87.10396098919013]
Large Language Models (LLMs) have demonstrated extensive knowledge and remarkable proficiency in temporal reasoning.
We propose a Large Language Models-guided Dynamic Adaptation (LLM-DA) method for reasoning on Temporal Knowledge Graphs.
LLM-DA harnesses the capabilities of LLMs to analyze historical data and extract temporal logical rules.
arXiv Detail & Related papers (2024-05-23T04:54:37Z)
- Deja vu: Contrastive Historical Modeling with Prefix-tuning for Temporal Knowledge Graph Reasoning [16.408149489677154]
ChapTER is a Contrastive historical modeling framework with prefix-tuning for TEmporal Reasoning.
We evaluate ChapTER on four transductive and three few-shot inductive TKGR benchmarks.
arXiv Detail & Related papers (2024-03-25T17:25:40Z)
- GenTKG: Generative Forecasting on Temporal Knowledge Graph with Large Language Models [35.594662986581746]
Large language models (LLMs) have ignited interest in the temporal knowledge graph (tKG) domain, where conventional embedding-based and rule-based methods dominate.
We propose a novel retrieval-augmented generation framework named GenTKG combining a temporal logical rule-based retrieval strategy and few-shot parameter-efficient instruction tuning.
Experiments have shown that GenTKG outperforms conventional methods of temporal relational forecasting with low computation resources.
arXiv Detail & Related papers (2023-10-11T18:27:12Z)
- Leveraging Pre-trained Language Models for Time Interval Prediction in Text-Enhanced Temporal Knowledge Graphs [1.4916971861796382]
We propose a novel framework called TEMT that exploits the power of pre-trained language models (PLMs) for text-enhanced temporal knowledge graph completion.
Unlike previous approaches, TEMT effectively captures dependencies across different time points and enables predictions on unseen entities.
arXiv Detail & Related papers (2023-09-28T11:43:49Z)
- NeuSTIP: A Novel Neuro-Symbolic Model for Link and Time Prediction in Temporal Knowledge Graphs [13.442923127130806]
We propose a novel temporal neuro-symbolic model, NeuSTIP, that performs link prediction and time interval prediction in a temporal knowledge graph.
NeuSTIP learns temporal rules in the presence of the Allen predicates that ensure the temporal consistency between neighboring predicates.
Our empirical evaluation on two time-interval-based datasets suggests that our model outperforms state-of-the-art models on both the link prediction and time interval prediction tasks.
arXiv Detail & Related papers (2023-05-15T13:46:34Z)
- Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves up to a 10.13% performance improvement over state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z)
- Meta-Learning Based Knowledge Extrapolation for Temporal Knowledge Graph [4.103806361930888]
Temporal KGs (TKGs) extend traditional Knowledge Graphs by associating static triples with timestamps, forming quadruples.
We propose a Meta-Learning based Temporal Knowledge Graph Extrapolation (MTKGE) model, which is trained on link prediction tasks sampled from the existing TKGs.
We show that MTKGE consistently outperforms the existing state-of-the-art models for knowledge graph extrapolation.
arXiv Detail & Related papers (2023-02-11T09:52:26Z)
- Generic Temporal Reasoning with Differential Analysis and Explanation [61.96034987217583]
We introduce a novel task named TODAY that bridges the gap with temporal differential analysis.
TODAY evaluates whether systems can correctly understand the effect of incremental changes.
We show that TODAY's supervision style and explanation annotations can be used in joint learning.
arXiv Detail & Related papers (2022-12-20T17:40:03Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Networks (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- Inductive Learning on Commonsense Knowledge Graph Completion [89.72388313527296]
A commonsense knowledge graph (CKG) is a special type of knowledge graph (KG) where entities are composed of free-form text.
We propose to study the inductive learning setting for CKG completion, where unseen entities may be present at test time.
InductivE significantly outperforms state-of-the-art baselines in both standard and inductive settings on ATOMIC and ConceptNet benchmarks.
arXiv Detail & Related papers (2020-09-19T16:10:26Z)
- Temporal Common Sense Acquisition with Minimal Supervision [77.8308414884754]
This work proposes a novel sequence modeling approach that exploits explicit and implicit mentions of temporal common sense.
Our method is shown to give quality predictions of various dimensions of temporal common sense.
It also produces representations of events for relevant tasks such as duration comparison, parent-child relations, event coreference and temporal QA.
arXiv Detail & Related papers (2020-05-08T22:20:16Z)