Arbitrary Time Information Modeling via Polynomial Approximation for Temporal Knowledge Graph Embedding
- URL: http://arxiv.org/abs/2405.00358v1
- Date: Wed, 1 May 2024 07:27:04 GMT
- Title: Arbitrary Time Information Modeling via Polynomial Approximation for Temporal Knowledge Graph Embedding
- Authors: Zhiyu Fang, Jingyan Qin, Xiaobin Zhu, Chun Yang, Xu-Cheng Yin
- Abstract summary: Temporal knowledge graphs (TKGs) must adequately explore and reason over temporally evolving facts.
Existing TKG approaches face two main challenges, i.e., the limited capability to model arbitrary timestamps continuously and the lack of rich inference patterns under temporal constraints.
We propose an innovative TKGE method (PTBox) via polynomial decomposition-based temporal representation and box embedding-based entity representation.
- Score: 23.39851202825318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Distinguished from traditional knowledge graphs (KGs), temporal knowledge graphs (TKGs) must explore and reason over temporally evolving facts adequately. However, existing TKG approaches still face two main challenges, i.e., the limited capability to model arbitrary timestamps continuously and the lack of rich inference patterns under temporal constraints. In this paper, we propose an innovative TKGE method (PTBox) via polynomial decomposition-based temporal representation and box embedding-based entity representation to tackle the above-mentioned problems. Specifically, we decompose time information by polynomials and then enhance the model's capability to represent arbitrary timestamps flexibly by incorporating the learnable temporal basis tensor. In addition, we model every entity as a hyperrectangle box and define each relation as a transformation on the head and tail entity boxes. The entity boxes can capture complex geometric structures and learn robust representations, improving the model's inductive capability for rich inference patterns. Theoretically, our PTBox can encode arbitrary time information or even unseen timestamps while capturing rich inference patterns and higher-arity relations of the knowledge base. Extensive experiments on real-world datasets demonstrate the effectiveness of our method.
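To make the two components described in the abstract more concrete, here is a minimal, hypothetical PyTorch sketch of how a polynomial timestamp encoding with a learnable temporal basis tensor and box-shaped entity embeddings might be wired together. The class name, dimensions, relation transform, and score function are illustrative assumptions, not the authors' implementation.

```python
# A minimal, hypothetical sketch (not the authors' code): polynomial timestamp
# encoding with a learnable temporal basis tensor, plus box-shaped entities
# whose boxes are transformed by relations.
import torch
import torch.nn as nn

class PTBoxSketch(nn.Module):
    def __init__(self, n_entities, n_relations, dim=64, poly_degree=4):
        super().__init__()
        self.poly_degree = poly_degree
        # Learnable temporal basis tensor: maps the polynomial features of a
        # timestamp [1, t, t^2, ..., t^k] to a d-dimensional time vector.
        self.time_basis = nn.Parameter(0.1 * torch.randn(poly_degree + 1, dim))
        # Each entity is a hyperrectangle (box): a center plus per-dimension offsets.
        self.ent_center = nn.Embedding(n_entities, dim)
        self.ent_offset = nn.Embedding(n_entities, dim)
        # Each relation acts as a simple affine transformation on the head box
        # (an assumed form; the paper only says relations transform entity boxes).
        self.rel_scale = nn.Embedding(n_relations, dim)
        self.rel_shift = nn.Embedding(n_relations, dim)

    def time_encoding(self, t):
        # Works for arbitrary continuous timestamps, including unseen ones,
        # because it only evaluates polynomials of t.
        powers = torch.stack([t ** k for k in range(self.poly_degree + 1)], dim=-1)
        return powers @ self.time_basis                      # (batch, dim)

    def score(self, head, rel, tail, t):
        time_vec = self.time_encoding(t)
        # Transform the head box by the relation and shift it by the time vector.
        h_center = self.ent_center(head) * self.rel_scale(rel) + self.rel_shift(rel) + time_vec
        h_offset = torch.abs(self.ent_offset(head)) * torch.abs(self.rel_scale(rel))
        t_center = self.ent_center(tail)
        t_offset = torch.abs(self.ent_offset(tail))
        # Typical box score: zero penalty when the boxes overlap, growing with
        # the gap between them; higher (less negative) means more plausible.
        gap = torch.relu(torch.abs(h_center - t_center) - (h_offset + t_offset))
        return -gap.sum(dim=-1)

# Toy usage: score the fact (entity 0, relation 1, entity 2) at timestamp 0.37.
model = PTBoxSketch(n_entities=100, n_relations=10)
print(model.score(torch.tensor([0]), torch.tensor([1]),
                  torch.tensor([2]), torch.tensor([0.37])))
```

In this sketch the polynomial features are what allow arbitrary or unseen continuous timestamps to be encoded, while the box offsets give each entity a region rather than a point, which is the geometric structure the abstract credits for richer inference patterns.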
Related papers
- Learning Granularity Representation for Temporal Knowledge Graph Completion [2.689675451882683]
Temporal Knowledge Graphs (TKGs) incorporate temporal information to reflect the dynamic structural knowledge and evolutionary patterns of real-world facts.
This paper proposes Learning Granularity Representation (termed LGRe) for TKG completion.
It comprises two main components: Granularity Learning (GRL) and Adaptive Granularity Balancing (AGB)
arXiv Detail & Related papers (2024-08-27T08:19:34Z) - Simple but Effective Compound Geometric Operations for Temporal Knowledge Graph Completion [18.606006541284422]
Temporal knowledge graph completion aims to infer the missing facts in temporal knowledge graphs.
Current approaches usually embed factual knowledge into continuous vector space and apply geometric operations to learn potential patterns in temporal knowledge graphs.
We propose TCompoundE, which is specially designed with two geometric operations, including time-specific and relation-specific operations.
arXiv Detail & Related papers (2024-08-13T03:36:30Z) - Temporal Inductive Path Neural Network for Temporal Knowledge Graph
Reasoning [16.984588879938947]
Reasoning on Temporal Knowledge Graph (TKG) aims to predict future facts based on historical occurrences.
Most existing approaches model TKGs relying on entity modeling, as nodes in the graph play a crucial role in knowledge representation.
We propose Temporal Inductive Path Neural Network (TiPNN), which models historical information in an entity-independent perspective.
arXiv Detail & Related papers (2023-09-06T17:37:40Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, namely Denoising Diffusion Probabilistic Models (DDPMs), for chirographic data.
Our model, named ChiroDiff, is non-autoregressive; it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic
Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
arXiv Detail & Related papers (2022-04-10T22:24:11Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE)
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - TCGL: Temporal Contrastive Graph for Self-supervised Video
Representation Learning [79.77010271213695]
We propose a novel video self-supervised learning framework named Temporal Contrastive Graph Learning (TCGL)
Our TCGL integrates prior knowledge about the frame and snippet orders into graph structures, i.e., the intra-/inter-snippet Temporal Contrastive Graphs (TCG).
To generate supervisory signals for unlabeled videos, we introduce an Adaptive Snippet Order Prediction (ASOP) module.
arXiv Detail & Related papers (2021-12-07T09:27:56Z) - Temporal Knowledge Graph Completion using Box Embeddings [13.858051019755283]
BoxTE is a box embedding model for temporal knowledge graph completion.
We show that BoxTE is fully expressive, and possesses strong inductive capacity in the temporal setting.
We then empirically evaluate our model and show that it achieves state-of-the-art results on several TKGC benchmarks.
arXiv Detail & Related papers (2021-09-18T17:12:13Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Temporal Knowledge Graph Reasoning Based on Evolutional Representation
Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN)
arXiv Detail & Related papers (2021-04-21T05:12:21Z) - On the Post-hoc Explainability of Deep Echo State Networks for Time
Series Forecasting, Image and Video Classification [63.716247731036745]
Echo State Networks have attracted considerable attention over time, mainly due to the simplicity and computational efficiency of their learning algorithm.
This work addresses this issue by conducting an explainability study of Echo State Networks when applied to learning tasks with time series, image and video data.
Specifically, the study proposes three different techniques capable of eliciting understandable information about the knowledge grasped by these recurrent models.
arXiv Detail & Related papers (2021-02-17T08:56:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.