Temporal graph models fail to capture global temporal dynamics
- URL: http://arxiv.org/abs/2309.15730v3
- Date: Fri, 8 Dec 2023 13:54:55 GMT
- Title: Temporal graph models fail to capture global temporal dynamics
- Authors: Michał Daniluk, Jacek Dąbrowski
- Abstract summary: We propose a trivial optimization-free baseline of "recently popular nodes"
We show how standard negative sampling evaluation can be unsuitable for datasets with strong temporal dynamics.
Our results indicate that temporal graph network architectures need deep rethinking for usage in problems with significant global dynamics.
- Score: 0.43512163406552007
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A recently released Temporal Graph Benchmark is analyzed in the context of
Dynamic Link Property Prediction. We outline our observations and propose a
trivial optimization-free baseline of "recently popular nodes" outperforming
other methods on medium and large-size datasets in the Temporal Graph
Benchmark. We propose two measures based on Wasserstein distance which can
quantify the strength of short-term and long-term global dynamics of datasets.
By analyzing our unexpectedly strong baseline, we show how standard negative
sampling evaluation can be unsuitable for datasets with strong temporal
dynamics. We also show how simple negative sampling can lead to model
degeneration during training, resulting in fully saturated, impossible-to-rank
predictions from temporal graph networks. We propose improved negative sampling
schemes for both training and evaluation and prove their usefulness. We conduct
a comparison with a model trained non-contrastively without negative sampling.
Our results provide a challenging baseline and indicate that temporal graph
network architectures need deep rethinking for usage in problems with
significant global dynamics, such as social media, cryptocurrency markets or
e-commerce. We open-source the code for baselines, measures and proposed
negative sampling schemes.
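The "recently popular nodes" baseline is simple enough to sketch from the abstract alone. The following is a minimal illustration of the idea, not the authors' released code: rank each candidate destination by how often it appeared as a destination within a sliding window of recent interactions. The window size and class interface are hypothetical choices made for this sketch.

```python
from collections import Counter, deque

class RecentlyPopularBaseline:
    """Optimization-free baseline: rank candidate destination nodes by
    their popularity within a sliding window of recent interactions."""

    def __init__(self, window=1000):
        self.window = window   # hypothetical hyperparameter
        self.recent = deque()  # destinations of the most recent interactions
        self.counts = Counter()  # per-node popularity inside the window

    def score(self, node):
        # Higher count -> node is ranked higher as a predicted destination.
        return self.counts[node]

    def update(self, dst):
        # Register a new interaction's destination and evict stale ones.
        self.recent.append(dst)
        self.counts[dst] += 1
        if len(self.recent) > self.window:
            old = self.recent.popleft()
            self.counts[old] -= 1
```

Because scoring requires no training, a baseline like this doubles as a sanity check: a learned temporal graph model that cannot beat it on a dataset with strong global dynamics is, per the abstract's argument, not capturing those dynamics.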
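The abstract does not spell out the improved negative sampling schemes themselves (they are in the paper and its open-sourced code). One common scheme consistent with its critique of uniform negatives is to mix in "historical" negatives drawn from recently observed destinations, which are harder to distinguish from positives when global dynamics are strong. A hypothetical sketch, with `p_hist` as an assumed mixing parameter:

```python
import random

def historical_negative(recent_dsts, all_nodes, true_dst,
                        p_hist=0.5, rng=random):
    """Draw one negative destination for a (src, true_dst) interaction:
    with probability p_hist from recently seen destinations (hard
    negatives), otherwise uniformly from all nodes. Never returns the
    true destination. Illustrative only, not the paper's exact scheme."""
    for _ in range(100):  # rejection sampling with a bounded retry loop
        if recent_dsts and rng.random() < p_hist:
            cand = rng.choice(recent_dsts)
        else:
            cand = rng.choice(all_nodes)
        if cand != true_dst:
            return cand
    # Fallback: pick uniformly among all valid negatives.
    return rng.choice([n for n in all_nodes if n != true_dst])
```

Uniform negatives are trivially separable when a few nodes dominate recent activity, which is one way a saturated, un-rankable model can still look good under standard evaluation.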
Related papers
- Curriculum Negative Mining For Temporal Networks [33.70909189731187]
Temporal networks are effective in capturing the evolving interactions of networks over time.
CurNM is a model-aware curriculum learning framework that adaptively adjusts the difficulty of negative samples.
Our method outperforms baseline methods by a significant margin.
arXiv Detail & Related papers (2024-07-24T07:55:49Z)
- Diffusion-based Negative Sampling on Graphs for Link Prediction [8.691564173331924]
Link prediction is a fundamental task for graph analysis with important applications on the Web, such as social network analysis and recommendation systems.
We propose a novel strategy of multi-level negative sampling that enables negative node generation with flexible and controllable "hardness" levels from the latent space.
Our method, called Conditional Diffusion-based Multi-level Negative Sampling (DMNS), leverages the Markov chain property of diffusion models to generate negative nodes in multiple levels of variable hardness.
arXiv Detail & Related papers (2024-03-25T23:07:31Z)
- New Perspectives on the Evaluation of Link Prediction Algorithms for Dynamic Graphs [12.987894327817159]
We introduce novel visualization methods that can yield insight into prediction performance and the dynamics of temporal networks.
We validate empirically, on datasets extracted from recent benchmarks, that the error is typically not evenly distributed across different data segments.
arXiv Detail & Related papers (2023-11-30T11:57:07Z)
- Evaluating Graph Neural Networks for Link Prediction: Current Pitfalls and New Benchmarking [66.83273589348758]
Link prediction attempts to predict whether an unseen edge exists based on only a portion of edges of a graph.
A flurry of methods have been introduced in recent years that attempt to make use of graph neural networks (GNNs) for this task.
New and diverse datasets have also been created to better evaluate the effectiveness of these new models.
arXiv Detail & Related papers (2023-06-18T01:58:59Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model-class namely "Denoising Diffusion Probabilistic Models" or DDPMs for chirographic data.
Our model, named "ChiroDiff", being non-autoregressive, learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z)
- Diving into Unified Data-Model Sparsity for Class-Imbalanced Graph Representation Learning [30.23894624193583]
Training Graph Neural Networks (GNNs) on non-Euclidean graph data often incurs relatively high time costs.
We develop a unified data-model dynamic sparsity framework named Graph Decantation (GraphDec) to address the challenges of training on massive, class-imbalanced graph data.
arXiv Detail & Related papers (2022-10-01T01:47:00Z)
- Model Inversion Attacks against Graph Neural Networks [65.35955643325038]
We study model inversion attacks against Graph Neural Networks (GNNs).
In this paper, we present GraphMI to infer the private training graph data.
Our experimental results show that such defenses are not sufficiently effective and call for more advanced defenses against privacy attacks.
arXiv Detail & Related papers (2022-09-16T09:13:43Z)
- Learning to Reconstruct Missing Data from Spatiotemporal Graphs with Sparse Observations [11.486068333583216]
This paper tackles the problem of learning effective models to reconstruct missing data points.
We propose a class of attention-based architectures that, given a set of highly sparse observations, learn a representation for points in time and space.
Compared to the state of the art, our model handles sparse data without propagating prediction errors or requiring a bidirectional model to encode forward and backward time dependencies.
arXiv Detail & Related papers (2022-05-26T16:40:48Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Back2Future: Leveraging Backfill Dynamics for Improving Real-time Predictions in Future [73.03458424369657]
In real-time forecasting in public health, data collection is a non-trivial and demanding task.
The 'backfill' phenomenon and its effect on model performance have barely been studied in the prior literature.
We formulate a novel problem and neural framework Back2Future that aims to refine a given model's predictions in real-time.
arXiv Detail & Related papers (2021-06-08T14:48:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.