Complex Evolutional Pattern Learning for Temporal Knowledge Graph Reasoning
- URL: http://arxiv.org/abs/2203.07782v1
- Date: Tue, 15 Mar 2022 11:02:55 GMT
- Title: Complex Evolutional Pattern Learning for Temporal Knowledge Graph Reasoning
- Authors: Zixuan Li, Saiping Guan, Xiaolong Jin, Weihua Peng, Yajuan Lyu, Yong
Zhu, Long Bai, Wei Li, Jiafeng Guo and Xueqi Cheng
- Abstract summary: TKG reasoning aims to predict potential facts in the future given the historical KG sequences.
The evolutional patterns are complex in two aspects, length-diversity and time-variability.
We propose a new model, called Complex Evolutional Network (CEN), which uses a length-aware Convolutional Neural Network (CNN) to handle evolutional patterns of different lengths.
- Score: 60.94357727688448
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A Temporal Knowledge Graph (TKG) is a sequence of KGs corresponding to
different timestamps. TKG reasoning aims to predict potential facts in the
future given the historical KG sequences. A key to this task is to mine and
understand the evolutional patterns of facts from these sequences. These
patterns are complex in two aspects: length-diversity and time-variability.
Existing models for TKG reasoning focus on modeling fact sequences of a fixed
length and therefore cannot discover complex evolutional patterns that vary in
length. Furthermore, these models are all trained offline and thus cannot adapt
well to changes in evolutional patterns after training. We therefore propose a
new model, called Complex Evolutional Network (CEN), which uses a length-aware
Convolutional Neural Network (CNN) to handle evolutional patterns of different
lengths via an easy-to-difficult curriculum learning strategy. In addition, we
propose to learn the model in an online setting so that it can adapt to
changes in evolutional patterns over time. Extensive experiments demonstrate
that CEN obtains substantial performance improvements under both the
traditional offline setting and the proposed online setting.
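To make the length-aware idea concrete, here is a minimal NumPy sketch, not the authors' implementation: it scores candidate entities using one convolution kernel per history length and combines the per-length scores. All names, dimensions, and the averaging ensemble are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8       # embedding dimension (illustrative)
K_MAX = 3   # maximum history length considered
N_ENT = 5   # number of candidate entities

# Hypothetical per-timestamp KG embeddings for one query, newest last.
history = rng.normal(size=(6, D))

# One kernel per history length k, spanning the last k timestamps
# (a stand-in for the paper's length-aware CNN).
kernels = {k: rng.normal(size=(k, D)) * 0.1 for k in range(1, K_MAX + 1)}

# Candidate object-entity embeddings.
entity_emb = rng.normal(size=(N_ENT, D))

def score_with_length(k):
    """Score all candidates using only the last k timestamps."""
    window = history[-k:]                       # (k, D) recent history
    feat = np.sum(kernels[k] * window, axis=0)  # (D,) pooled feature
    return entity_emb @ feat                    # (N_ENT,) scores

# Length-aware ensemble: combine score vectors from every history length.
scores = np.mean([score_with_length(k) for k in range(1, K_MAX + 1)], axis=0)
prediction = int(np.argmax(scores))
```

Under this reading, an easy-to-difficult curriculum would train the length-1 kernel first and progressively add longer lengths, while the online setting would keep updating the kernels as new timestamps arrive.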
Related papers
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Understanding Patterns of Deep Learning Model Evolution in Network Architecture Search [0.8124699127636158]
We show how the evolution of the model structure is influenced by the regularized evolution algorithm.
We describe how evolutionary patterns appear in distributed settings and opportunities for caching and improved scheduling.
arXiv Detail & Related papers (2023-09-22T02:12:47Z)
- An Evolution Kernel Method for Graph Classification through Heat Diffusion Dynamics [12.094047128690834]
We propose a heat-driven method to transform each static graph into a sequence of temporal ones.
This approach effectively describes the evolutional behaviours of the system.
It has been successfully applied to classification problems in real-world structural graph datasets.
arXiv Detail & Related papers (2023-06-26T13:32:11Z)
- MetaTKG: Learning Evolutionary Meta-Knowledge for Temporal Knowledge Graph Reasoning [23.690981770829282]
We propose a novel Temporal Meta-learning framework for TKG reasoning, MetaTKG for brevity.
Specifically, our method regards TKG prediction as many temporal meta-tasks, and utilizes the designed Temporal Meta-learner to learn evolutionary meta-knowledge from these meta-tasks.
The proposed method aims to guide the backbones to learn to adapt quickly to future data and deal with entities with little historical information by the learned meta-knowledge.
arXiv Detail & Related papers (2023-02-02T05:55:41Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Learning of Visual Relations: The Devil is in the Tails [59.737494875502215]
Visual relation learning is a long-tailed problem, due to the nature of joint reasoning about groups of objects.
In this paper, we explore an alternative hypothesis, denoted the Devil is in the Tails.
Under this hypothesis, better performance is achieved by keeping the model simple but improving its ability to cope with long-tailed distributions.
arXiv Detail & Related papers (2021-08-22T08:59:35Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
Key to predict future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- Learning Temporal Dynamics from Cycles in Narrated Video [85.89096034281694]
We propose a self-supervised solution to the problem of learning to model how the world changes as time elapses.
Our model learns modality-agnostic functions to predict forward and backward in time, which must undo each other when composed.
We apply the learned dynamics model without further training to various tasks, such as predicting future action and temporally ordering sets of images.
arXiv Detail & Related papers (2021-01-07T02:41:32Z)
- Streaming Graph Neural Networks via Continual Learning [31.810308087441445]
Graph neural networks (GNNs) have achieved strong performance in various applications.
In this paper, we propose a streaming GNN model based on continual learning.
We show that our model can efficiently update model parameters and achieve comparable performance to model retraining.
arXiv Detail & Related papers (2020-09-23T06:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.