Leveraging Pre-trained Language Models for Time Interval Prediction in
Text-Enhanced Temporal Knowledge Graphs
- URL: http://arxiv.org/abs/2309.16357v1
- Date: Thu, 28 Sep 2023 11:43:49 GMT
- Title: Leveraging Pre-trained Language Models for Time Interval Prediction in
Text-Enhanced Temporal Knowledge Graphs
- Authors: Duygu Sezen Islakoglu, Mel Chekol, Yannis Velegrakis
- Abstract summary: We propose a novel framework called TEMT that exploits the power of pre-trained language models (PLMs) for text-enhanced temporal knowledge graph completion.
Unlike previous approaches, TEMT effectively captures dependencies across different time points and enables predictions on unseen entities.
- Score: 1.4916971861796382
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Most knowledge graph completion (KGC) methods learn latent representations of
entities and relations of a given graph by mapping them into a vector space.
Although the majority of these methods focus on static knowledge graphs, a
large number of publicly available KGs contain temporal information stating the
time instant/period over which a certain fact has been true. Such graphs are
often known as temporal knowledge graphs. Furthermore, knowledge graphs may
also contain textual descriptions of entities and relations. Neither temporal
information nor textual descriptions are taken into account by static KGC
methods during representation learning; only the structural information of the
graph is leveraged. Recently, some studies have used temporal
information to improve link prediction, yet they do not exploit textual
descriptions and do not support inductive inference (prediction on entities
that have not been seen in training).
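As a concrete illustration of the vector-space mapping mentioned above, the minimal sketch below shows a TransE-style scorer, one common form of static KGC model; it is a generic example and not this paper's method, and the embedding dimension and L1 distance are arbitrary choices. Because entities sit in a fixed lookup table, such a model cannot score entities unseen during training, which is exactly the inductive limitation noted above.

import torch
import torch.nn as nn

class TransEScorer(nn.Module):
    # Generic static KGC scorer: entities and relations are mapped into one vector space.
    def __init__(self, num_entities: int, num_relations: int, dim: int = 128):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)   # lookup table of entity vectors
        self.rel = nn.Embedding(num_relations, dim)  # lookup table of relation vectors

    def forward(self, head, rel, tail):
        # A triple (head, rel, tail) is plausible when head + rel lands close to tail.
        h, r, t = self.ent(head), self.rel(rel), self.ent(tail)
        return -torch.norm(h + r - t, p=1, dim=-1)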
We propose a novel framework called TEMT that exploits the power of
pre-trained language models (PLMs) for text-enhanced temporal knowledge graph
completion. The knowledge stored in the parameters of a PLM allows TEMT to
produce rich semantic representations of facts and to generalize on previously
unseen entities. TEMT leverages textual and temporal information available in a
KG, treats them separately, and fuses them to get plausibility scores of facts.
Unlike previous approaches, TEMT effectively captures dependencies across
different time points and enables predictions on unseen entities. To assess the
performance of TEMT, we carried out several experiments including time interval
prediction, both in transductive and inductive settings, and triple
classification. The experimental results show that TEMT is competitive with the
state-of-the-art.
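To make the text-and-time fusion concrete, here is a minimal sketch of the idea as the abstract describes it: a PLM encodes a verbalized fact, the time point gets its own encoding, and the two are fused into a plausibility score. The PLM choice (bert-base-uncased), the sinusoidal time features, and the MLP fusion head are assumptions made only for illustration; TEMT's actual architecture and training objective are given in the paper.

import math
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TextTimeScorer(nn.Module):
    # Illustrative text+time fact scorer (an assumption-laden sketch, not TEMT itself).
    def __init__(self, plm_name: str = "bert-base-uncased", time_dim: int = 32):
        super().__init__()
        self.tok = AutoTokenizer.from_pretrained(plm_name)
        self.plm = AutoModel.from_pretrained(plm_name)        # encodes the fact text
        hidden = self.plm.config.hidden_size
        self.time_dim = time_dim
        self.fuse = nn.Sequential(                            # fuses text and time features
            nn.Linear(hidden + time_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def encode_time(self, year: float) -> torch.Tensor:
        # Sinusoidal features of the time point (an assumed encoding, not TEMT's).
        half = self.time_dim // 2
        freqs = torch.exp(torch.arange(half) * (-math.log(10000.0) / half))
        angles = year * freqs
        return torch.cat([torch.sin(angles), torch.cos(angles)])

    def forward(self, fact_text: str, year: float) -> torch.Tensor:
        batch = self.tok(fact_text, return_tensors="pt", truncation=True)
        text_vec = self.plm(**batch).last_hidden_state[:, 0]  # [CLS] representation
        time_vec = self.encode_time(year).unsqueeze(0)
        return self.fuse(torch.cat([text_vec, time_vec], dim=-1)).squeeze(-1)

# Example usage: score a verbalized fact at a given year (hypothetical verbalization).
# scorer = TextTimeScorer()
# score = scorer("Barack Obama holds the position of President of the United States", 2010.0)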
Related papers
- Learning Granularity Representation for Temporal Knowledge Graph Completion [2.689675451882683]
Temporal Knowledge Graphs (TKGs) incorporate temporal information to reflect the dynamic structural knowledge and evolutionary patterns of real-world facts.
This paper proposes Learning Granularity Representation (termed LGRe) for TKG completion.
It comprises two main components: Granularity Representation Learning (GRL) and Adaptive Granularity Balancing (AGB).
arXiv Detail & Related papers (2024-08-27T08:19:34Z)
- TEILP: Time Prediction over Knowledge Graphs via Logical Reasoning [14.480267340831542]
We propose TEILP, a logical reasoning framework that naturally integrates temporal elements into knowledge graph predictions.
We first convert TKGs into a temporal event knowledge graph (TEKG), which represents time more explicitly as nodes of the graph.
Finally, we introduce conditional probability density functions, associated with the logical rules involving the query interval, from which we obtain the time prediction.
arXiv Detail & Related papers (2023-12-25T21:54:56Z)
- Time-aware Graph Structure Learning via Sequence Prediction on Temporal Graphs [10.034072706245544]
We propose a Time-aware Graph Structure Learning (TGSL) approach via sequence prediction on temporal graphs.
In particular, it predicts a time-aware context embedding and uses the Gumbel-Top-K trick to select the candidate edges closest to this context embedding (see the sketch after this list).
Experiments on temporal link prediction benchmarks demonstrate that TGSL yields significant gains for the popular TGNs such as TGAT and GraphMixer.
arXiv Detail & Related papers (2023-06-13T11:34:36Z)
- Schema-aware Reference as Prompt Improves Data-Efficient Knowledge Graph Construction [57.854498238624366]
We propose a retrieval-augmented approach, which retrieves schema-aware Reference As Prompt (RAP) for data-efficient knowledge graph construction.
RAP can dynamically leverage schema and knowledge inherited from human-annotated and weak-supervised data as a prompt for each sample.
arXiv Detail & Related papers (2022-10-19T16:40:28Z)
- DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task, together with a structure contrastive learning task, to effectively identify the time-invariant and time-varying representations, respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- TLogic: Temporal Logical Rules for Explainable Link Forecasting on Temporal Knowledge Graphs [13.085620598065747]
In temporal knowledge graphs, time information is integrated into the graph by equipping each edge with a timestamp or a time range.
We introduce TLogic, an explainable framework that is based on temporal logical rules extracted via temporal random walks.
arXiv Detail & Related papers (2021-12-15T10:46:35Z)
- TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion [45.588053447288566]
Inferring missing facts in temporal knowledge graphs (TKGs) is a fundamental and challenging task.
We propose the Temporal Message Passing (TeMP) framework to address these challenges by combining graph neural networks, temporal dynamics models, data imputation and frequency-based gating techniques.
arXiv Detail & Related papers (2020-10-07T17:11:53Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
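As a forward-referenced illustration for the TGSL entry above, the sketch below shows the Gumbel-Top-K trick: sampling k items without replacement, with probability proportional to the softmax of their scores, here used to pick the candidate edges closest to a time-aware context embedding. The dot-product similarity, tensor shapes, and variable names are illustrative assumptions, not TGSL's actual implementation.

import torch

def gumbel_top_k(scores: torch.Tensor, k: int) -> torch.Tensor:
    # Perturb the scores with Gumbel(0, 1) noise and keep the top-k indices;
    # this samples k items without replacement from softmax(scores).
    gumbel = -torch.log(-torch.log(torch.rand_like(scores)))
    return torch.topk(scores + gumbel, k).indices

# Example: pick 5 of 100 candidate edges whose embeddings best match the context.
context = torch.randn(64)          # time-aware context embedding (illustrative)
candidates = torch.randn(100, 64)  # embeddings of 100 candidate edges (illustrative)
scores = candidates @ context      # similarity scores used as logits
chosen = gumbel_top_k(scores, k=5) # indices of the selected edges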