Learning from History: Modeling Temporal Knowledge Graphs with
Sequential Copy-Generation Networks
- URL: http://arxiv.org/abs/2012.08492v2
- Date: Fri, 5 Mar 2021 10:03:52 GMT
- Title: Learning from History: Modeling Temporal Knowledge Graphs with
Sequential Copy-Generation Networks
- Authors: Cunchao Zhu, Muhao Chen, Changjun Fan, Guangquan Cheng, Yan Zhang
- Abstract summary: We propose a new representation learning model for temporal knowledge graphs, namely CyGNet, based on a novel time-aware copy-generation mechanism.
CyGNet is not only able to predict future facts from the whole entity vocabulary, but is also capable of identifying facts with repetition and accordingly predicting such future facts with reference to the known facts in the past.
- Score: 8.317441990017924
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Large knowledge graphs often grow to store temporal facts that model the
dynamic relations or interactions of entities along the timeline. Since such
temporal knowledge graphs often suffer from incompleteness, it is important to
develop time-aware representation learning models that help to infer the
missing temporal facts. While the temporal facts are typically evolving, it is
observed that many facts often show a repeated pattern along the timeline, such
as economic crises and diplomatic activities. This observation indicates that a
model could potentially learn much from the known facts that appeared in history. To
this end, we propose a new representation learning model for temporal knowledge
graphs, namely CyGNet, based on a novel time-aware copy-generation mechanism.
CyGNet is not only able to predict future facts from the whole entity
vocabulary, but also capable of identifying facts with repetition and
accordingly predicting such future facts with reference to the known facts in
the past. We evaluate the proposed method on the knowledge graph completion
task using five benchmark datasets. Extensive experiments demonstrate the
effectiveness of CyGNet for predicting future facts with repetition as well as
de novo fact prediction.
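To make the copy-generation idea concrete, the sketch below scores a query (subject, relation, timestamp) against the entity vocabulary in two modes: a copy mode restricted to objects already seen with that subject and relation in the past, and a generation mode over the whole vocabulary, mixed by a weighting coefficient. This is a minimal PyTorch sketch and not CyGNet's released implementation; the embedding size, the linear scoring heads, the fixed mixing weight alpha, and the history_mask construction are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CopyGenerationScorer(nn.Module):
        """Minimal time-aware copy-generation scorer (illustrative only;
        hyperparameters and scoring heads are assumptions, not CyGNet's)."""

        def __init__(self, num_entities, num_relations, num_timestamps,
                     dim=64, alpha=0.5):
            super().__init__()
            self.ent_emb = nn.Embedding(num_entities, dim)
            self.rel_emb = nn.Embedding(num_relations, dim)
            self.time_emb = nn.Embedding(num_timestamps, dim)
            self.copy_head = nn.Linear(3 * dim, num_entities)  # copy-mode scores
            self.gen_head = nn.Linear(3 * dim, num_entities)   # generation-mode scores
            self.alpha = alpha  # weight of the copy mode in the final mixture

        def forward(self, subj, rel, t, history_mask):
            # history_mask: (batch, num_entities) bool tensor, True for objects
            # that already co-occurred with (subj, rel) at earlier timestamps.
            q = torch.cat([self.ent_emb(subj), self.rel_emb(rel),
                           self.time_emb(t)], dim=-1)
            # Copy mode: restrict the distribution to the historical vocabulary.
            copy_scores = self.copy_head(q).masked_fill(~history_mask, -1e9)
            copy_prob = F.softmax(copy_scores, dim=-1)
            # Generation mode: score every entity, enabling de novo predictions.
            gen_prob = F.softmax(self.gen_head(q), dim=-1)
            # Final prediction mixes repetition-aware and de novo evidence.
            return self.alpha * copy_prob + (1 - self.alpha) * gen_prob

    # Toy usage with made-up sizes and a hand-built history mask.
    scorer = CopyGenerationScorer(num_entities=100, num_relations=10,
                                  num_timestamps=50)
    subj, rel, t = torch.tensor([3]), torch.tensor([1]), torch.tensor([7])
    mask = torch.zeros(1, 100, dtype=torch.bool)
    mask[0, [3, 12, 40]] = True                    # objects seen in the past
    probs = scorer(subj, rel, t, mask)             # (1, 100) distribution

In the paper, the copy vocabulary is accumulated from the historical snapshots up to the query time and the two modes are trained jointly; the scalar alpha here stands in for whatever mixing scheme the full model uses.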
Related papers
- HIP Network: Historical Information Passing Network for Extrapolation
Reasoning on Temporal Knowledge Graph [14.832067253514213]
We propose the Historical Information Passing (HIP) network to predict future events.
Our method takes the updating of relation representations into account and adopts three scoring functions corresponding to the different dimensions of historical information.
Experimental results on five benchmark datasets show the superiority of the HIP network.
arXiv Detail & Related papers (2024-02-19T11:50:30Z)
- Temporal Inductive Path Neural Network for Temporal Knowledge Graph
Reasoning [16.984588879938947]
Reasoning on Temporal Knowledge Graph (TKG) aims to predict future facts based on historical occurrences.
Most existing approaches model TKGs relying on entity modeling, as nodes in the graph play a crucial role in knowledge representation.
We propose Temporal Inductive Path Neural Network (TiPNN), which models historical information from an entity-independent perspective.
arXiv Detail & Related papers (2023-09-06T17:37:40Z)
- Exploring the Limits of Historical Information for Temporal Knowledge
Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both historical and non-historical dependencies to distinguish the most likely entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z)
- Mitigating Temporal Misalignment by Discarding Outdated Facts [58.620269228776294]
Large language models are often used under temporal misalignment, tasked with answering questions about the present.
We propose fact duration prediction: the task of predicting how long a given fact will remain true.
Our data and code are released publicly at https://github.com/mikejqzhang/mitigating_misalignment.
arXiv Detail & Related papers (2023-05-24T07:30:08Z)
- Complex Evolutional Pattern Learning for Temporal Knowledge Graph
Reasoning [60.94357727688448]
TKG reasoning aims to predict potential facts in the future given the historical KG sequences.
The evolutional patterns are complex in two aspects: length-diversity and time-variability.
We propose a new model, called Complex Evolutional Network (CEN), which uses a length-aware Convolutional Neural Network (CNN) to handle evolutional patterns of different lengths.
arXiv Detail & Related papers (2022-03-15T11:02:55Z)
- Search from History and Reason for Future: Two-stage Reasoning on
Temporal Knowledge Graphs [56.33651635705633]
We propose CluSTeR to predict future facts in a two-stage manner, Clue Searching and Temporal Reasoning.
CluSTeR learns a beam search policy via reinforcement learning (RL) to induce multiple clues from historical facts.
At the temporal reasoning stage, it adopts a graph convolution network based sequence method to deduce answers from clues.
arXiv Detail & Related papers (2021-06-01T09:01:22Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation
Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- ChronoR: Rotation Based Temporal Knowledge Graph Embedding [8.039202293739185]
We study the challenging problem of inference over temporal knowledge graphs.
We propose Chronological Rotation embedding (ChronoR), a novel model for learning representations for entities, relations, and time.
ChronoR is able to outperform many of the state-of-the-art methods on the benchmark datasets for temporal knowledge graph link prediction.
arXiv Detail & Related papers (2021-03-18T17:08:33Z)
- Tucker decomposition-based Temporal Knowledge Graph Completion [35.56360622521721]
We build a new tensor decomposition model for temporal knowledge graph completion, inspired by the Tucker decomposition of an order-4 tensor.
We demonstrate that the proposed model is fully expressive and report state-of-the-art results for several public benchmarks.
arXiv Detail & Related papers (2020-11-16T07:05:52Z)
- A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for
Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.