Temporal Smoothness Regularisers for Neural Link Predictors
- URL: http://arxiv.org/abs/2309.09045v2
- Date: Tue, 05 Nov 2024 15:26:50 GMT
- Title: Temporal Smoothness Regularisers for Neural Link Predictors
- Authors: Manuel Dileo, Pasquale Minervini, Matteo Zignani, Sabrina Gaito
- Abstract summary: We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models.
- Score: 8.975480841443272
- Abstract: Most algorithms for representation learning and link prediction on relational data are designed for static data. However, the data to which they are applied typically evolves over time, including online social networks or interactions between users and items in recommender systems. This is also the case for graph-structured knowledge bases -- knowledge graphs -- which contain facts that are valid only for specific points in time. In such contexts, it becomes crucial to correctly identify missing links at a precise time point, i.e. the temporal link prediction task. Recently, Lacroix et al. and Sadeghian et al. proposed a solution to the problem of link prediction for knowledge graphs under temporal constraints, inspired by the canonical decomposition of order-4 tensors, where they regularise the representations of time steps by enforcing temporal smoothing, i.e. by learning similar transformations for adjacent timestamps. However, the impact of the choice of temporal regularisation terms is still poorly understood. In this work, we systematically analyse several choices of temporal smoothing regularisers using linear functions and recurrent architectures. In our experiments, we show that by carefully selecting the temporal smoothing regulariser and regularisation weight, a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods on three widely used temporal link prediction datasets. Furthermore, we evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models. Our work shows that simple tensor factorisation models can produce new state-of-the-art results using newly proposed temporal regularisers, highlighting a promising avenue for future research.
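As a rough illustration of the temporal smoothing idea described in the abstract, the snippet below penalises differences between embeddings of adjacent timestamps with an Lp norm. It is a minimal sketch: the function name, the default value of p, and the normalisation are illustrative assumptions, not the exact regularisers studied in the paper.

```python
import torch

def temporal_smoothness(time_emb: torch.Tensor, p: float = 4.0) -> torch.Tensor:
    """Smoothness penalty over timestamp embeddings.

    time_emb: [num_timestamps, rank] matrix whose t-th row represents timestamp t.
    Returns a scalar term to be added to the link prediction loss, scaled by a
    regularisation weight tuned on validation data.
    """
    # Differences between representations of adjacent timestamps.
    diffs = time_emb[1:] - time_emb[:-1]
    # Lp penalty on those differences: smaller values correspond to smoother
    # trajectories of the timestamp embeddings over time.
    return diffs.abs().pow(p).sum() / max(time_emb.shape[0] - 1, 1)
```

In training, the overall objective would then be the usual link prediction loss plus this term multiplied by a regularisation weight, which, as the abstract notes, must be selected carefully.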
Related papers
- Inference of Sequential Patterns for Neural Message Passing in Temporal Graphs [0.6562256987706128]
HYPA-DBGNN is a novel two-step approach that combines the inference of anomalous sequential patterns in time series data on graphs with neural message passing.
Our method leverages hypergeometric graph ensembles to identify anomalous edges within both first- and higher-order De Bruijn graphs.
Our work is the first to introduce statistically informed GNNs that leverage temporal and causal sequence anomalies.
arXiv Detail & Related papers (2024-06-24T11:41:12Z) - From Link Prediction to Forecasting: Information Loss in Batch-based Temporal Graph Learning [0.716879432974126]
We show that the suitability of common batch-oriented evaluation depends on the datasets' characteristics.
We reformulate dynamic link prediction as a link forecasting task that better accounts for temporal information present in the data.
arXiv Detail & Related papers (2024-06-07T12:45:12Z) - TempSAL -- Uncovering Temporal Information for Deep Saliency Prediction [64.63645677568384]
We introduce a novel saliency prediction model that learns to output saliency maps in sequential time intervals.
Our approach locally modulates the saliency predictions by combining the learned temporal maps.
Our code will be publicly available on GitHub.
arXiv Detail & Related papers (2023-01-05T22:10:16Z) - Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weigh the edges of this graph based on the difference in time between interactions.
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
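A minimal sketch of such a construction, assuming an exponential decay of edge weights with the time gap between interactions (the decay form, the `decay` parameter, and the quadratic pairwise loop are illustrative simplifications, not necessarily the construction used in the cited paper):

```python
import math
import networkx as nx

def time_decayed_line_graph(interactions, decay: float = 1.0) -> nx.Graph:
    """Line graph over temporal interactions.

    interactions: list of (u, v, t) tuples; each interaction becomes a node.
    Two interactions that share an endpoint are connected, with a weight that
    decays with the absolute difference between their timestamps.
    """
    G = nx.Graph()
    for i, (u, v, t) in enumerate(interactions):
        G.add_node(i, endpoints=(u, v), time=t)
    # Naive O(n^2) pairwise pass; a practical implementation would index
    # interactions by endpoint to avoid comparing every pair.
    for i, (u1, v1, t1) in enumerate(interactions):
        for j in range(i + 1, len(interactions)):
            u2, v2, t2 = interactions[j]
            if {u1, v1} & {u2, v2}:  # interactions share at least one endpoint
                G.add_edge(i, j, weight=math.exp(-decay * abs(t1 - t2)))
    return G
```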
arXiv Detail & Related papers (2022-09-30T18:24:13Z) - Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
arXiv Detail & Related papers (2022-04-10T22:24:11Z) - TLogic: Temporal Logical Rules for Explainable Link Forecasting on Temporal Knowledge Graphs [13.085620598065747]
In temporal knowledge graphs, time information is integrated into the graph by equipping each edge with a timestamp or a time range.
We introduce TLogic, an explainable framework that is based on temporal logical rules extracted via temporal random walks.
arXiv Detail & Related papers (2021-12-15T10:46:35Z) - ChronoR: Rotation Based Temporal Knowledge Graph Embedding [8.039202293739185]
We study the challenging problem of inference over temporal knowledge graphs.
We propose Chronological Rotation embedding (ChronoR), a novel model for learning representations for entities, relations, and time.
ChronoR is able to outperform many of the state-of-the-art methods on the benchmark datasets for temporal knowledge graph link prediction.
arXiv Detail & Related papers (2021-03-18T17:08:33Z) - One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms the state-of-the-art baselines on two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z) - From Static to Dynamic Node Embeddings [61.58641072424504]
We introduce a general framework for leveraging graph stream data for temporal prediction-based applications.
Our proposed framework includes novel methods for learning an appropriate graph time-series representation.
We find that the top-3 temporal models are always those that leverage the new $\epsilon$-graph time-series representation.
arXiv Detail & Related papers (2020-09-21T16:48:29Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)