Spatio-Temporal Graph Contrastive Learning
- URL: http://arxiv.org/abs/2108.11873v1
- Date: Thu, 26 Aug 2021 16:05:32 GMT
- Title: Spatio-Temporal Graph Contrastive Learning
- Authors: Xu Liu, Yuxuan Liang, Yu Zheng, Bryan Hooi, Roger Zimmermann
- Abstract summary: We propose a Spatio-Temporal Graph Contrastive Learning framework (STGCL) to tackle these issues.
We elaborate on four types of data augmentations which disturb data in terms of graph structure, time domain, and frequency domain.
Our framework is evaluated across three real-world datasets and four state-of-the-art models.
- Score: 49.132528449909316
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning models are modern tools for spatio-temporal graph (STG)
forecasting. Despite their effectiveness, they require large-scale datasets to
achieve better performance and are vulnerable to noise perturbation. To
alleviate these limitations, an intuitive idea is to use the popular data
augmentation and contrastive learning techniques. However, existing graph
contrastive learning methods cannot be directly applied to STG forecasting due
to three reasons. First, we empirically discover that the forecasting task is
unable to benefit from the pretrained representations derived from contrastive
learning. Second, data augmentations that are used for defeating noise are less
explored for STG data. Third, the semantic similarity of samples has been
overlooked. In this paper, we propose a Spatio-Temporal Graph Contrastive
Learning framework (STGCL) to tackle these issues. Specifically, we improve the
performance by integrating the forecasting loss with an auxiliary contrastive
loss rather than using a pretrained paradigm. We elaborate on four types of
data augmentations, which disturb data in terms of graph structure, time
domain, and frequency domain. We also extend the classic contrastive loss
through a rule-based strategy that filters out the most semantically similar
negatives. Our framework is evaluated across three real-world datasets and four
state-of-the-art models. The consistent improvements demonstrate that STGCL can
be used as an off-the-shelf plug-in for existing deep models.
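The abstract's central design choice, adding an auxiliary contrastive loss to the forecasting objective rather than pretraining, can be sketched as follows. This is a minimal NumPy illustration using an InfoNCE-style contrastive term over two augmented views; the MSE forecasting loss, the function names, and the weighting factor `lam` are illustrative assumptions, not the paper's exact formulation (which also filters semantically similar negatives).

```python
import numpy as np

def info_nce(z1, z2, temperature=0.5):
    """InfoNCE contrastive loss between two augmented views.

    z1, z2: (N, D) embeddings of the same N samples under two
    augmentations. Matching rows are positives; every other row in
    the batch acts as a negative.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                 # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))              # positives on the diagonal

def joint_loss(y_pred, y_true, z1, z2, lam=0.1):
    """Forecasting MSE plus a weighted auxiliary contrastive term,
    optimized jointly instead of in a pretrain-then-finetune paradigm."""
    forecast = np.mean((y_pred - y_true) ** 2)
    return forecast + lam * info_nce(z1, z2)
```

In this sketch, lowering `lam` recovers plain forecasting training, and views whose embeddings agree (e.g., from mild augmentations) yield a smaller contrastive term than unrelated views.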
Related papers
- History repeats Itself: A Baseline for Temporal Knowledge Graph Forecasting [10.396081172890025]
Temporal Knowledge Graph (TKG) Forecasting aims at predicting links in Knowledge Graphs for future timesteps based on a history of Knowledge Graphs.
We propose to design an intuitive baseline for TKG Forecasting based on predicting recurring facts.
arXiv Detail & Related papers (2024-04-25T16:39:32Z)
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) that applies affine transformations to the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Time-aware Graph Structure Learning via Sequence Prediction on Temporal Graphs [10.034072706245544]
We propose a Time-aware Graph Structure Learning (TGSL) approach via sequence prediction on temporal graphs.
In particular, it predicts a time-aware context embedding and uses Gumbel-Top-K sampling to select the candidate edges closest to this context embedding.
Experiments on temporal link prediction benchmarks demonstrate that TGSL yields significant gains for the popular TGNs such as TGAT and GraphMixer.
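The TGSL summary above mentions Gumbel-Top-K selection of candidate edges. As a generic sketch (not TGSL's actual implementation), Gumbel-Top-K samples k items without replacement, with probabilities proportional to softmax of the scores, by perturbing each score with Gumbel noise and taking the top-k; the score values and edge interpretation below are illustrative assumptions.

```python
import numpy as np

def gumbel_top_k(scores, k, rng):
    """Sample k indices without replacement, weighted by softmax(scores).

    Adding i.i.d. Gumbel(0, 1) noise to the scores and taking the top-k
    of the perturbed scores is equivalent to sequential sampling from
    the softmax distribution without replacement.
    """
    gumbel = -np.log(-np.log(rng.uniform(size=scores.shape)))
    return np.argsort(scores + gumbel)[::-1][:k]

# Hypothetical usage: scores are similarities of candidate edges to a
# time-aware context embedding; we keep the k most relevant edges.
rng = np.random.default_rng(42)
edge_scores = np.array([5.0, 1.0, 0.5, -2.0, 3.0])
selected = gumbel_top_k(edge_scores, k=3, rng=rng)
```

Unlike a hard top-k, the Gumbel perturbation keeps the selection stochastic, which is what makes such edge selection usable as a differentiable or exploratory component in structure learning.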
arXiv Detail & Related papers (2023-06-13T11:34:36Z)
- LightGCL: Simple Yet Effective Graph Contrastive Learning for Recommendation [9.181689366185038]
Graph neural networks (GNNs) are a powerful learning approach for graph-based recommender systems.
In this paper, we propose a simple yet effective graph contrastive learning paradigm LightGCL.
arXiv Detail & Related papers (2023-02-16T10:16:21Z)
- DyG2Vec: Efficient Representation Learning for Dynamic Graphs [26.792732615703372]
Temporal graph neural networks have shown promising results in learning inductive representations by automatically extracting temporal patterns.
We present an efficient yet effective attention-based encoder that leverages temporal edge encodings and window-based subgraph sampling to generate task-agnostic embeddings.
arXiv Detail & Related papers (2022-10-30T18:13:04Z)
- ARIEL: Adversarial Graph Contrastive Learning [51.14695794459399]
ARIEL consistently outperforms the current graph contrastive learning methods for both node-level and graph-level classification tasks.
ARIEL is more robust in the face of adversarial attacks.
arXiv Detail & Related papers (2022-08-15T01:24:42Z)
- Adversarial Graph Contrastive Learning with Information Regularization [51.14695794459399]
Contrastive learning is an effective method in graph representation learning.
Data augmentation on graphs is far less intuitive, making it much harder to provide high-quality contrastive samples.
We propose a simple but effective method, Adversarial Graph Contrastive Learning (ARIEL)
It consistently outperforms the current graph contrastive learning methods in the node classification task over various real-world datasets.
arXiv Detail & Related papers (2022-02-14T05:54:48Z)
- Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations [78.12377360145078]
Contrastive self-supervised learning has outperformed supervised pretraining on many downstream tasks like segmentation and object detection.
In this paper, we first study how biases in the dataset affect existing methods.
We show that current contrastive approaches work surprisingly well across: (i) object- versus scene-centric, (ii) uniform versus long-tailed and (iii) general versus domain-specific datasets.
arXiv Detail & Related papers (2021-06-10T17:59:13Z)
- Catastrophic Forgetting in Deep Graph Networks: an Introductory Benchmark for Graph Classification [12.423303337249795]
We study the phenomenon of catastrophic forgetting in the graph representation learning scenario.
We find that replay is the most effective strategy so far, and it also benefits the most from the use of regularization.
arXiv Detail & Related papers (2021-03-22T12:07:21Z)
- Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel, mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.