Abstract: Knowledge Graph (KG) completion has been extensively studied, with a massive
number of models proposed for the Link Prediction (LP) task. The main
limitation of such models is their insensitivity to time. Indeed, the temporal
aspect of stored facts is often ignored. To this end, more and more works
consider time as a parameter to complete KGs. In this paper, we first
demonstrate that, by simply increasing the number of negative samples, the
recent AttH model can achieve competitive or even better performance than the
state of the art on Temporal KGs (TKGs), despite being nontemporal. We further
propose Hercules, a time-aware extension of the AttH model, which defines the
curvature of a Riemannian manifold as the product of both relation and time.
Our experiments show that both Hercules and AttH achieve competitive or new
state-of-the-art performance on the ICEWS04 and ICEWS05-15 datasets. These
results suggest that, when learning TKG representations, one should verify
whether time truly boosts performance.
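As an illustration of the curvature-as-product idea, here is a minimal sketch, not the authors' implementation; the class and parameter names (`ProductCurvature`, `rel_curv`, `time_curv`) are assumptions for exposition. Each relation and each timestamp carries a learnable scalar factor, and the curvature used for a (relation, time) pair is their product, kept strictly positive as hyperbolic geometry requires:

```python
import torch

class ProductCurvature(torch.nn.Module):
    """Hypothetical sketch: curvature as a product of relation and time factors."""

    def __init__(self, n_relations: int, n_timestamps: int):
        super().__init__()
        # One learnable scalar per relation and per timestamp (assumed parameterization).
        self.rel_curv = torch.nn.Embedding(n_relations, 1)
        self.time_curv = torch.nn.Embedding(n_timestamps, 1)

    def forward(self, rel_idx: torch.Tensor, time_idx: torch.Tensor) -> torch.Tensor:
        # Product of the two factors; softplus keeps the curvature strictly positive.
        raw = self.rel_curv(rel_idx) * self.time_curv(time_idx)
        return torch.nn.functional.softplus(raw)

# Usage: curvatures for two (relation, time) pairs.
curv = ProductCurvature(n_relations=10, n_timestamps=5)
c = curv(torch.tensor([0, 3]), torch.tensor([1, 4]))
```

In a nontemporal model such as AttH, the curvature would depend on the relation alone; conditioning it on time as well lets the geometry of the embedding space vary across timestamps.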