TimeTraveler: Reinforcement Learning for Temporal Knowledge Graph
Forecasting
- URL: http://arxiv.org/abs/2109.04101v1
- Date: Thu, 9 Sep 2021 08:41:01 GMT
- Title: TimeTraveler: Reinforcement Learning for Temporal Knowledge Graph
Forecasting
- Authors: Haohai Sun, Jialun Zhong, Yunpu Ma, Zhen Han and Kun He
- Abstract summary: We propose the first reinforcement learning method for forecasting. Specifically, the agent travels on historical knowledge graph snapshots to search for the answer.
Our method defines a relative time encoding function to capture the timespan information, and we design a novel time-shaped reward based on Dirichlet distribution to guide the model learning.
We evaluate our method for this link prediction task at future timestamps.
- Score: 12.963769928056253
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal knowledge graph (TKG) reasoning is a crucial task that has gained
increasing research interest in recent years. Most existing methods focus on
reasoning at past timestamps to complete the missing facts, and there are only
a few works of reasoning on known TKGs to forecast future facts. Compared with
the completion task, the forecasting task is more difficult, as it faces two main
challenges: (1) how to effectively model time information to handle future
timestamps, and (2) how to perform inductive inference for previously unseen
entities that emerge over time. To address these challenges, we propose the
first reinforcement learning method for forecasting. Specifically, the agent
travels on historical knowledge graph snapshots to search for the answer. Our
method defines a relative time encoding function to capture the timespan
information, and we design a novel time-shaped reward based on Dirichlet
distribution to guide the model learning. Furthermore, we propose a novel
representation method for unseen entities to improve the inductive inference
ability of the model. We evaluate our method for this link prediction task at
future timestamps. Extensive experiments on four benchmark datasets demonstrate
substantial performance improvements, together with higher explainability, less
computation, and fewer parameters, compared with existing state-of-the-art
methods.
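The abstract names two time-related components, a relative time encoding function and a Dirichlet-based time-shaped reward, without giving their exact form. The sketch below is a minimal illustration of how such components could look; the sinusoidal encoding, the fixed two-parameter Dirichlet density, and the way it rescales the binary hit reward are assumptions for illustration, not the paper's actual definitions.

```python
import numpy as np
from scipy.stats import dirichlet

def relative_time_encoding(t_query, t_fact, dim=32):
    """Encode the timespan between the query timestamp and a historical fact.

    A sinusoidal encoding of the time difference is one simple choice; the
    paper defines its own relative time encoding function.
    """
    dt = float(t_query - t_fact)                     # timespan between query and fact
    freqs = 1.0 / (10000.0 ** (np.arange(dim // 2) * 2.0 / dim))
    angles = dt * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)])

def time_shaped_reward(hit, t_answer, t_min, t_max, alpha=(2.0, 2.0)):
    """Reshape the terminal 0/1 reward with a Dirichlet density over time.

    The reached timestamp is mapped to a point on the 2-simplex and the
    Dirichlet(alpha) density at that point scales the reward; alpha=(2, 2)
    is an arbitrary placeholder.
    """
    span = max(t_max - t_min, 1e-8)
    p = float(np.clip((t_answer - t_min) / span, 1e-6, 1.0 - 1e-6))
    shape = dirichlet.pdf([p, 1.0 - p], alpha)       # density at [position, 1 - position]
    return float(hit) * (1.0 + shape)
```

A full agent would feed encodings like these into its policy network while searching over historical snapshots and would optimize the shaped reward with policy gradients; both parts are omitted here.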
Related papers
- History repeats Itself: A Baseline for Temporal Knowledge Graph Forecasting [10.396081172890025] (2024-04-25)
Temporal Knowledge Graph (TKG) Forecasting aims at predicting links in Knowledge Graphs for future timesteps based on a history of Knowledge Graphs.
We propose to design an intuitive baseline for TKG Forecasting based on predicting recurring facts.
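A recurrence baseline of this kind can be stated in a few lines. The sketch below is a hedged illustration only (the paper's baseline may differ in its exact scoring): rank candidate objects for a query (s, r, ?, t) by how often they completed the same (s, r) pair at earlier timestamps.

```python
from collections import Counter, defaultdict

def build_recurrence_index(history):
    """history: iterable of (subject, relation, object, timestamp) quadruples."""
    index = defaultdict(Counter)
    for s, r, o, t in history:
        index[(s, r)][o] += 1
    return index

def rank_candidates(index, s, r):
    """Rank candidate objects by how often (s, r, o) recurred in the past."""
    return [o for o, _ in index[(s, r)].most_common()]

# Toy usage; the quadruples are made up for illustration.
history = [
    ("actor_A", "consult", "actor_B", 0),
    ("actor_A", "consult", "actor_B", 1),
    ("actor_A", "consult", "actor_C", 2),
]
print(rank_candidates(build_recurrence_index(history), "actor_A", "consult"))
# ['actor_B', 'actor_C']
```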
- TEILP: Time Prediction over Knowledge Graphs via Logical Reasoning [14.480267340831542] (2023-12-25)
We propose TEILP, a logical reasoning framework that naturally integrates temporal elements into knowledge graph predictions.
We first convert TKGs into a temporal event knowledge graph (TEKG), which has a more explicit representation of time in terms of nodes of the graph.
Finally, we introduce conditional probability density functions, associated with the logical rules involving the query interval, using which we arrive at the time prediction.
- Contrastive Difference Predictive Coding [79.74052624853303] (2023-10-31)
We introduce a temporal difference version of contrastive predictive coding that stitches together pieces of different time series data to decrease the amount of data required to learn predictions of future events.
We apply this representation learning method to derive an off-policy algorithm for goal-conditioned RL.
- Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion [11.699431017532367] (2023-10-24)
Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting aims to predict the missing entity from a fact in the future.
We propose our model, Re-Temp, which leverages explicit temporal embedding as input and incorporates skip information flow after each timestamp to skip unnecessary information for prediction.
We demonstrate that our model outperforms all eight recent state-of-the-art models by a significant margin.
- Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146] (2023-08-29)
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both historical and non-historical dependencies to distinguish the most likely entities.
We evaluate our proposed model on five benchmark graphs.
- Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001] (2023-03-21)
A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
- Generic Temporal Reasoning with Differential Analysis and Explanation [61.96034987217583] (2022-12-20)
We introduce a novel task named TODAY that bridges the gap with temporal differential analysis.
TODAY evaluates whether systems can correctly understand the effect of incremental changes.
We show that TODAY's supervision style and explanation annotations can be used in joint learning.
- Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic Representations [1.8262547855491458] (2022-04-10)
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
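The summary above mentions a cycle-aware time-encoding scheme without specifying it. As a hedged sketch, assuming "cycle-aware" means encoding periodic calendar components with sine/cosine pairs so that, e.g., December and January land close together, time features could be built as follows; the chosen cycles and the function name are illustrative, not the paper's scheme.

```python
import math
from datetime import date

def cycle_aware_time_features(d: date):
    """Encode a date through periodic (cyclic) components: day-of-week,
    day-of-month, and month-of-year, each mapped onto a sine/cosine pair."""
    cycles = [
        (d.weekday(), 7),    # day of week
        (d.day - 1, 31),     # day of month
        (d.month - 1, 12),   # month of year
    ]
    feats = []
    for value, period in cycles:
        angle = 2.0 * math.pi * value / period
        feats.extend([math.sin(angle), math.cos(angle)])
    return feats

print(cycle_aware_time_features(date(2022, 4, 10)))  # 6-dimensional feature vector
```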
- Search from History and Reason for Future: Two-stage Reasoning on Temporal Knowledge Graphs [56.33651635705633] (2021-06-01)
We propose CluSTeR to predict future facts in a two-stage manner, Clue Searching and Temporal Reasoning.
CluSTeR learns a beam search policy via reinforcement learning (RL) to induce multiple clues from historical facts.
At the temporal reasoning stage, it adopts a graph convolutional network-based sequence method to deduce answers from the clues.
- Tucker decomposition-based Temporal Knowledge Graph Completion [35.56360622521721] (2020-11-16)
We build a new tensor decomposition model for temporal knowledge graph completion, inspired by the Tucker decomposition of an order-4 tensor.
We demonstrate that the proposed model is fully expressive and report state-of-the-art results for several public benchmarks.
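As a hedged sketch of what an order-4 Tucker scoring function looks like, a quadruple (subject, relation, object, timestamp) can be scored by contracting the four embeddings with a shared core tensor; the dimensions, random initializations, and einsum layout below are illustrative, not the model's actual parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: d_e entity, d_r relation, d_t timestamp embedding dims.
d_e, d_r, d_t = 8, 6, 4
n_entities, n_relations, n_timestamps = 100, 20, 50

E = rng.normal(size=(n_entities, d_e))      # entity embeddings
R = rng.normal(size=(n_relations, d_r))     # relation embeddings
T = rng.normal(size=(n_timestamps, d_t))    # timestamp embeddings
W = rng.normal(size=(d_e, d_r, d_e, d_t))   # shared order-4 core tensor

def score(s, r, o, t):
    """Tucker-style score of the quadruple (s, r, o, t): contract the core
    tensor with the subject, relation, object, and timestamp embeddings."""
    return np.einsum("abcd,a,b,c,d->", W, E[s], R[r], E[o], T[t])

print(score(3, 5, 7, 11))  # a single real-valued plausibility score
```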
- One-shot Learning for Temporal Knowledge Graphs [49.41854171118697] (2020-10-23)
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms the state of the art baselines for two well-studied benchmarks.
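The summary only says that a self-attention mechanism encodes temporal interactions between entities. A generic, hedged sketch of scaled dot-product self-attention over a sequence of interaction embeddings is given below; the matrix shapes and names are illustrative and do not reproduce the paper's architecture.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over interaction embeddings.

    X: (n, d) matrix with one row per historical interaction of an entity,
       e.g., an embedding of (neighbor entity, relation, timestamp).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # context-aware encodings

rng = np.random.default_rng(0)
n, d, d_k = 5, 16, 8
X = rng.normal(size=(n, d))
out = self_attention(X, *(rng.normal(size=(d, d_k)) for _ in range(3)))
print(out.shape)  # (5, 8)
```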
This list is automatically generated from the titles and abstracts of the papers on this site.