TIE: A Framework for Embedding-based Incremental Temporal Knowledge
Graph Completion
- URL: http://arxiv.org/abs/2104.08419v1
- Date: Sat, 17 Apr 2021 01:40:46 GMT
- Title: TIE: A Framework for Embedding-based Incremental Temporal Knowledge
Graph Completion
- Authors: Jiapeng Wu, Yishi Xu, Yingxue Zhang, Chen Ma, Mark Coates and Jackie
Chi Kit Cheung
- Abstract summary: Reasoning in a temporal knowledge graph (TKG) is a critical task for information retrieval and semantic search.
Recent work approaches TKG completion (TKGC) by augmenting the encoder-decoder framework with a time-aware encoding function.
We present the Time-aware Incremental Embedding (TIE) framework, which combines TKG representation learning, experience replay, and temporal regularization.
- Score: 37.76140466390048
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reasoning in a temporal knowledge graph (TKG) is a critical task for
information retrieval and semantic search. It is particularly challenging when
the TKG is updated frequently. The model has to adapt to changes in the TKG for
efficient training and inference while preserving its performance on historical
knowledge. Recent work approaches TKG completion (TKGC) by augmenting the
encoder-decoder framework with a time-aware encoding function. However, naively
fine-tuning the model at every time step using these methods does not address
the problems of 1) catastrophic forgetting, 2) the model's inability to
identify changed facts (e.g., a change of political affiliation or the end of
a marriage), and 3) the lack of training efficiency. To address these
challenges, we present the Time-aware Incremental Embedding (TIE) framework,
which combines TKG representation learning, experience replay, and temporal
regularization. We introduce a set of metrics that characterizes the
intransigence of the model and propose a constraint that associates the deleted
facts with negative labels. Experimental results on Wikidata12k and YAGO11k
datasets demonstrate that the proposed TIE framework reduces training time by
about ten times and improves on the proposed metrics compared to vanilla
full-batch training, without a significant loss in performance on traditional
measures. Extensive ablation studies reveal performance trade-offs
among different evaluation metrics, which is essential for decision-making
around real-world TKG applications.
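The abstract describes TIE as combining TKG representation learning with experience replay, temporal regularization, and negative labels for deleted facts. A minimal sketch of how such an incremental loss could be assembled, assuming a TransE-style decoder and illustrative names (`tie_style_loss`, the tensor layouts) that are not taken from the paper:

```python
import torch
import torch.nn.functional as F

def tie_style_loss(entity_emb, entity_emb_prev, pos_triples, replay_triples,
                   deleted_triples, rel_emb, reg_weight=0.01):
    """Illustrative incremental-training loss: current-snapshot facts and
    replayed historical facts are positives, deleted facts get explicit
    negative labels, and an L2 temporal regularizer discourages drift
    from the previous time step's embeddings."""
    def score(triples):
        # TransE-style scoring (an assumption; other decoders fit equally well)
        h = entity_emb[triples[:, 0]]
        r = rel_emb[triples[:, 1]]
        t = entity_emb[triples[:, 2]]
        return -torch.norm(h + r - t, dim=1)

    # Experience replay: mix a sample of historical facts into the positives
    pos = torch.cat([pos_triples, replay_triples])
    loss_pos = F.binary_cross_entropy_with_logits(
        score(pos), torch.ones(len(pos)))
    # Deleted facts are associated with negative labels
    loss_del = F.binary_cross_entropy_with_logits(
        score(deleted_triples), torch.zeros(len(deleted_triples)))
    # Temporal regularization: penalize movement relative to the prior snapshot
    loss_reg = reg_weight * (entity_emb - entity_emb_prev).pow(2).sum()
    return loss_pos + loss_del + loss_reg
```

The three terms map onto the three components named in the abstract; only the current snapshot and a replay sample are touched each step, which is where the training-time savings over full-batch retraining would come from.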
Related papers
- Adaptive Retention & Correction: Test-Time Training for Continual Learning [114.5656325514408]
A common problem in continual learning is the classification layer's bias towards the most recent task.
We name our approach Adaptive Retention & Correction (ARC).
ARC achieves an average performance increase of 2.7% and 2.6% on the CIFAR-100 and Imagenet-R datasets.
arXiv Detail & Related papers (2024-05-23T08:43:09Z)
- TILP: Differentiable Learning of Temporal Logical Rules on Knowledge Graphs [17.559644723196843]
We propose TILP, a differentiable framework for temporal logical rules learning.
We model temporal features in TKGs, e.g., recurrence, temporal order, the interval between a pair of relations, and duration, and incorporate them into our learning process.
We show that our proposed framework can improve upon the performance of baseline methods while providing interpretable results.
arXiv Detail & Related papers (2024-02-19T17:30:44Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- History Repeats: Overcoming Catastrophic Forgetting For Event-Centric Temporal Knowledge Graph Completion [33.38304336898247]
Temporal knowledge graph (TKG) completion models rely on having access to the entire graph during training.
TKG data is often received incrementally as events unfold, leading to a dynamic non-stationary data distribution over time.
We propose a general continual training framework that is applicable to any TKG completion method.
arXiv Detail & Related papers (2023-05-30T01:21:36Z)
- EvoKG: Jointly Modeling Event Time and Network Structure for Reasoning over Temporal Knowledge Graphs [25.408246523764085]
Reasoning over temporal knowledge graphs (TKGs) is crucial for many applications to provide intelligent services.
We present a problem formulation that unifies the two major problems that must be addressed for effective reasoning over TKGs: modeling the event time and the evolving network structure.
Our proposed method EvoKG jointly models both tasks in an effective framework, which captures the ever-changing structural and temporal dynamics in TKGs.
arXiv Detail & Related papers (2022-02-15T18:49:53Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is a thorough understanding of historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolutional Networks (GCNs).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- T-GAP: Learning to Walk across Time for Temporal Knowledge Graph Completion [13.209193437124881]
Temporal knowledge graphs (TKGs) inherently reflect the transient nature of real-world knowledge, as opposed to static knowledge graphs.
We propose T-GAP, a novel model for TKG completion that maximally utilizes both temporal information and graph structure in its encoder and decoder.
Our experiments demonstrate that T-GAP achieves superior performance against state-of-the-art baselines, and competently generalizes to queries with unseen timestamps.
arXiv Detail & Related papers (2020-12-19T04:45:32Z)
- TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion [45.588053447288566]
Inferring missing facts in temporal knowledge graphs (TKGs) is a fundamental and challenging task.
We propose the Temporal Message Passing (TeMP) framework to address these challenges by combining graph neural networks, temporal dynamics models, data imputation and frequency-based gating techniques.
arXiv Detail & Related papers (2020-10-07T17:11:53Z)
- AdaS: Adaptive Scheduling of Stochastic Gradients [50.80697760166045]
We introduce the notions of "knowledge gain" and "mapping condition" and propose a new algorithm called Adaptive Scheduling (AdaS).
Experimentation reveals that, using the derived metrics, AdaS exhibits: (a) faster convergence and superior generalization over existing adaptive learning methods; and (b) lack of dependence on a validation set to determine when to stop training.
arXiv Detail & Related papers (2020-06-11T16:36:31Z)
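Among the related papers, the Koopman VAE entry describes representing the latent conditional prior dynamics with a linear map. A minimal sketch of that idea, assuming a Gaussian conditional prior and illustrative names (`LinearLatentPrior`, the fixed `sigma`) that are not taken from the paper:

```python
import torch
import torch.nn as nn

class LinearLatentPrior(nn.Module):
    """Koopman-inspired latent prior: the conditional prior over the next
    latent state is Gaussian with mean given by a single linear map A,
    i.e. z_{t+1} ~ N(A z_t, sigma^2 I). The Gaussian form and fixed noise
    scale are simplifying assumptions for this sketch."""
    def __init__(self, latent_dim, sigma=0.1):
        super().__init__()
        # One shared linear operator governs all transitions (Koopman theory
        # motivates approximating nonlinear dynamics by a linear map in
        # a learned latent space)
        self.A = nn.Linear(latent_dim, latent_dim, bias=False)
        self.sigma = sigma

    def log_prob(self, z_seq):
        # z_seq: (batch, time, latent_dim); score every transition under the prior
        mean = self.A(z_seq[:, :-1])                 # predicted next states
        dist = torch.distributions.Normal(mean, self.sigma)
        return dist.log_prob(z_seq[:, 1:]).sum(dim=(1, 2))
```

In a VAE, this log-probability would replace the usual standard-normal prior term in the ELBO, tying consecutive latent states together through the learned linear dynamics.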
This list is automatically generated from the titles and abstracts of the papers in this site.