TILP: Differentiable Learning of Temporal Logical Rules on Knowledge Graphs
- URL: http://arxiv.org/abs/2402.12309v1
- Date: Mon, 19 Feb 2024 17:30:44 GMT
- Title: TILP: Differentiable Learning of Temporal Logical Rules on Knowledge Graphs
- Authors: Siheng Xiong, Yuan Yang, Faramarz Fekri, James Clayton Kerce
- Abstract summary: We propose TILP, a differentiable framework for learning temporal logical rules.
We model temporal features in tKGs, e.g., recurrence, temporal order, intervals between pairs of relations, and duration, and incorporate them into our learning process.
We show that our proposed framework can improve upon the performance of baseline methods while providing interpretable results.
- Score: 17.559644723196843
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Compared with static knowledge graphs, temporal knowledge graphs
(tKG), which can capture the evolution and change of information over time, are
more realistic and general. However, due to the complexity that the notion of
time introduces to rule learning, accurate graph reasoning, e.g., predicting
new links between entities, remains a difficult problem. In this paper, we
propose TILP, a differentiable framework for learning temporal logical rules.
By designing a constrained random walk mechanism and introducing temporal
operators, we ensure the efficiency of our model. We model temporal features in
tKGs, e.g., recurrence, temporal order, intervals between pairs of relations,
and duration, and incorporate them into our learning process. We compare TILP
with state-of-the-art methods on two benchmark datasets and show that our
framework improves upon the performance of baseline methods while providing
interpretable results. In particular, we consider scenarios in which training
samples are limited, data is biased, and the time ranges of training and
inference differ; in all these cases, TILP performs substantially better than
the state-of-the-art methods.
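The constrained random walk idea can be illustrated with a minimal sketch. The entity and relation names below, and the particular non-decreasing-time constraint, are illustrative assumptions, not the paper's exact mechanism:

```python
import random
from collections import defaultdict

# A tKG fact: (head, relation, tail, start_time, end_time).
# Entities e1..e3 and relations r1/r2 are made up for illustration.
FACTS = [
    ("e1", "r1", "e2", 1, 2),
    ("e2", "r2", "e3", 3, 4),
    ("e1", "r1", "e3", 5, 6),
]

def build_index(facts):
    """Index outgoing edges by head entity."""
    index = defaultdict(list)
    for h, r, t, ts, te in facts:
        index[h].append((r, t, ts, te))
    return index

def constrained_walk(index, start, length, rng):
    """Sample a random walk whose hops are temporally ordered: each hop
    must start no earlier than the previous hop ended (one simple
    temporal-order constraint of the kind the framework enforces)."""
    path, current, prev_end = [], start, float("-inf")
    for _ in range(length):
        candidates = [e for e in index[current] if e[2] >= prev_end]
        if not candidates:
            return None  # walk cannot be extended under the constraint
        r, t, ts, te = rng.choice(candidates)
        path.append((current, r, t, ts, te))
        current, prev_end = t, te
    return path

walk = constrained_walk(build_index(FACTS), "e1", 2, random.Random(0))
```

Restricting candidates to temporally admissible edges is what keeps the walk space small and the search efficient.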
Related papers
- Simple but Effective Compound Geometric Operations for Temporal Knowledge Graph Completion [18.606006541284422]
Temporal knowledge graph completion aims to infer the missing facts in temporal knowledge graphs.
Current approaches usually embed factual knowledge into continuous vector space and apply geometric operations to learn potential patterns in temporal knowledge graphs.
We propose TCompoundE, which is specially designed with two geometric operations, including time-specific and relation-specific operations.
arXiv Detail & Related papers (2024-08-13T03:36:30Z)
- Temporal Smoothness Regularisers for Neural Link Predictors [8.975480841443272]
We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models.
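A typical temporal smoothness regulariser penalises differences between the embeddings of adjacent timestamps. The sketch below assumes a simple L_p difference penalty as a generic example of this family, not necessarily the exact regularisers the paper evaluates:

```python
import numpy as np

def temporal_smoothness_penalty(time_embeddings, p=2):
    """L_p penalty on differences between embeddings of consecutive
    timestamps: sum_t ||T[t+1] - T[t]||_p^p. Encourages the time
    representation to change gradually rather than jump."""
    diffs = time_embeddings[1:] - time_embeddings[:-1]
    return float(np.sum(np.abs(diffs) ** p))

# Identical consecutive embeddings incur zero penalty; one unit jump
# in one coordinate costs (2-1)^2 = 1.0 under the default p=2.
T = np.array([[1.0, 2.0], [1.0, 2.0], [2.0, 2.0]])
penalty = temporal_smoothness_penalty(T)
```

In practice this term is added to the link-prediction loss with a weight chosen on validation data.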
arXiv Detail & Related papers (2023-09-16T16:52:49Z)
- DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structural contrastive learning task to effectively identify the time-invariant and time-varying representations, respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z)
- Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to representing time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
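One common way to make a time encoding cycle-aware is to map timestamps onto a circle for each period of interest (day, week, year), so that times near the end of a cycle stay close to times near its start. The sine/cosine construction below illustrates this idea; the paper's exact scheme may differ:

```python
import math

def cycle_encoding(t, period):
    """Map timestamp t onto the unit circle with the given period, so
    that e.g. day 6 and day 13 of a weekly cycle get the same encoding,
    and the end of a cycle sits next to its beginning."""
    angle = 2.0 * math.pi * (t % period) / period
    return (math.sin(angle), math.cos(angle))

# Concatenating encodings for several periods (e.g. 7 for day-of-week,
# 12 for month) yields a multi-cycle time feature vector.
weekly = cycle_encoding(1, 7)
```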
arXiv Detail & Related papers (2022-04-10T22:24:11Z)
- Towards Similarity-Aware Time-Series Classification [51.2400839966489]
We study time-series classification (TSC), a fundamental task of time-series data mining.
We propose Similarity-Aware Time-Series Classification (SimTSC), a framework that models similarity information with graph neural networks (GNNs).
arXiv Detail & Related papers (2022-01-05T02:14:57Z)
- TLogic: Temporal Logical Rules for Explainable Link Forecasting on Temporal Knowledge Graphs [13.085620598065747]
In temporal knowledge graphs, time information is integrated into the graph by equipping each edge with a timestamp or a time range.
We introduce TLogic, an explainable framework that is based on temporal logical rules extracted via temporal random walks.
arXiv Detail & Related papers (2021-12-15T10:46:35Z)
- Contrastive learning of strong-mixing continuous-time stochastic processes [53.82893653745542]
Contrastive learning is a family of self-supervised methods where a model is trained to solve a classification task constructed from unlabeled data.
We show that a properly constructed contrastive learning task can be used to estimate the transition kernel for small-to-mid-range intervals in the diffusion case.
arXiv Detail & Related papers (2021-03-03T23:06:47Z)
- One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms the state of the art baselines for two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
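One way a graph learning module can guarantee uni-directed relations is to score node pairs with an antisymmetric term before the nonlinearity, so that at most one direction of each edge survives. The sketch below is a simplified illustration of that construction; the actual module also applies top-k sparsification and other details omitted here:

```python
import numpy as np

def learn_adjacency(E1, E2, alpha=3.0):
    """Build a uni-directed adjacency matrix from two learned
    node-embedding matrices E1, E2 (shape: nodes x dim). Because
    E1 @ E2.T - E2 @ E1.T is antisymmetric, if A[i, j] > 0 after the
    ReLU then A[j, i] == 0: each edge keeps a single direction."""
    scores = np.tanh(alpha * (E1 @ E2.T - E2 @ E1.T))
    return np.maximum(scores, 0.0)  # ReLU keeps one direction per pair

rng = np.random.default_rng(0)
A = learn_adjacency(rng.normal(size=(5, 8)), rng.normal(size=(5, 8)))
```

In the full model this learned adjacency feeds the graph convolution layers in place of a hand-specified variable graph.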
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.