Temporal Graph Representation Learning with Adaptive Augmentation Contrastive
- URL: http://arxiv.org/abs/2311.03897v1
- Date: Tue, 7 Nov 2023 11:21:16 GMT
- Title: Temporal Graph Representation Learning with Adaptive Augmentation Contrastive
- Authors: Hongjiang Chen, Pengfei Jiao, Huijun Tang, Huaming Wu
- Abstract summary: Temporal graph representation learning aims to generate low-dimensional dynamic node embeddings to capture temporal information.
We propose a novel Temporal Graph representation learning with Adaptive augmentation Contrastive (TGAC) model.
Our experiments on various real networks demonstrate that the proposed model outperforms other temporal graph representation learning methods.
- Score: 12.18909612212823
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal graph representation learning aims to generate low-dimensional
dynamic node embeddings to capture temporal information as well as structural
and property information. Current representation learning methods for temporal
networks often focus on capturing fine-grained information, which may lead to
the model capturing random noise instead of essential semantic information.
While graph contrastive learning has shown promise in dealing with noise, it
only applies to static graphs or snapshots and may not be suitable for handling
time-dependent noise. To alleviate the above challenge, we propose a novel
Temporal Graph representation learning with Adaptive augmentation Contrastive
(TGAC) model. Adaptive augmentation of the temporal graph is performed by
combining prior knowledge with temporal information, and the contrastive
objective is constructed from inter-view and intra-view contrasts defined over
the augmented views. To complement TGAC, we propose three adaptive
augmentation strategies that modify topological features to reduce noise from
the network. Our extensive experiments on various real networks demonstrate
that the proposed model outperforms other temporal graph representation
learning methods.
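As a rough illustration of the pattern described in the abstract (not the authors' released implementation), the PyTorch sketch below combines an importance-weighted edge-dropping augmentation with an InfoNCE-style loss whose negatives come from both the other view (inter-view) and the same view (intra-view). The names `adaptive_edge_drop`, `edge_importance`, `inter_intra_contrast`, and `tau` are hypothetical; in the paper, edge importance would come from the proposed prior-/time-aware augmentation strategies, which are not reproduced here.

```python
# Minimal sketch, assuming PyTorch and a COO edge list; not the TGAC code.
import torch
import torch.nn.functional as F

def adaptive_edge_drop(edge_index: torch.Tensor,
                       edge_importance: torch.Tensor,
                       max_drop: float = 0.5) -> torch.Tensor:
    """Drop unimportant edges with higher probability.

    edge_index:      [2, E] edge list
    edge_importance: [E] scores in [0, 1], e.g. a normalized time-aware
                     centrality (hypothetical stand-in for the paper's strategies)
    """
    drop_prob = max_drop * (1.0 - edge_importance)   # low importance -> high drop prob
    keep_mask = torch.rand_like(drop_prob) >= drop_prob
    return edge_index[:, keep_mask]

def inter_intra_contrast(z1: torch.Tensor, z2: torch.Tensor,
                         tau: float = 0.5) -> torch.Tensor:
    """Symmetric contrastive loss over two augmented views.

    z1, z2: [N, d] node embeddings from the two views (same node ordering).
    Positive: the same node in the other view.
    Negatives: all other nodes, both inter-view and intra-view.
    """
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    inter = torch.exp(z1 @ z2.t() / tau)             # cross-view similarities
    intra1 = torch.exp(z1 @ z1.t() / tau)            # within-view similarities (view 1)
    intra2 = torch.exp(z2 @ z2.t() / tau)            # within-view similarities (view 2)
    # anchor in view 1
    denom1 = inter.sum(dim=1) + intra1.sum(dim=1) - intra1.diag()
    loss_12 = -torch.log(inter.diag() / denom1)
    # anchor in view 2 (symmetric direction)
    denom2 = inter.t().sum(dim=1) + intra2.sum(dim=1) - intra2.diag()
    loss_21 = -torch.log(inter.diag() / denom2)
    return 0.5 * (loss_12 + loss_21).mean()
```

In practice, two views would be generated by applying `adaptive_edge_drop` (with different random draws or drop rates) to the same temporal graph, encoding each view with a shared temporal graph encoder, and minimizing the loss above over the resulting node embeddings.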
Related papers
- RDGSL: Dynamic Graph Representation Learning with Structure Learning [23.00398150548281]
Temporal Graph Networks (TGNs) have shown remarkable performance in learning representations for continuous-time dynamic graphs.
However, real-world dynamic graphs typically contain diverse and intricate noise.
Noise can significantly degrade the quality of representation generation, impeding the effectiveness of TGNs in downstream tasks.
arXiv Detail & Related papers (2023-09-05T08:03:59Z)
- Spatial-Temporal Graph Learning with Adversarial Contrastive Adaptation [19.419836274690816]
We propose a new spatial-temporal graph learning model (GraphST) for enabling effective self-supervised learning.
Our proposed model is an adversarial contrastive learning paradigm that automates the distillation of crucial multi-view self-supervised information.
We demonstrate the superiority of our proposed GraphST method in various spatial-temporal prediction tasks on real-life datasets.
arXiv Detail & Related papers (2023-06-19T03:09:35Z)
- TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden temporal dependencies without a predefined graph structure.
The experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep learning-based methods in the MTSC tasks.
arXiv Detail & Related papers (2023-04-11T09:21:28Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves up to 10.13% performance improvement compared with state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z)
- DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose DyTed, a novel disentangled representation learning framework for discrete-time dynamic graphs.
We design a temporal-clips contrastive learning task together with a structure contrastive learning task to effectively identify the time-invariant and time-varying representations, respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z)
- Adversarial Graph Contrastive Learning with Information Regularization [51.14695794459399]
Contrastive learning is an effective method in graph representation learning.
Data augmentation on graphs, however, is far less intuitive, and it is much harder to generate high-quality contrastive samples.
We propose a simple but effective method, Adversarial Graph Contrastive Learning (ARIEL).
It consistently outperforms current graph contrastive learning methods on the node classification task across various real-world datasets.
arXiv Detail & Related papers (2022-02-14T05:54:48Z)
- Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast [0.8379286663107846]
This paper proposes a self-supervised dynamic graph representation learning framework (DySubC).
DySubC defines a temporal subgraph contrastive learning task to simultaneously learn the structural and evolutional features of a dynamic graph.
Experiments on five real-world datasets demonstrate that DySubC performs better than the related baselines.
arXiv Detail & Related papers (2021-12-16T09:35:34Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Temporal Contrastive Graph Learning for Video Action Recognition and Retrieval [83.56444443849679]
This work takes advantage of the temporal dependencies within videos and proposes a novel self-supervised method named Temporal Contrastive Graph Learning (TCGL).
TCGL is rooted in a hybrid graph contrastive learning strategy that jointly regards inter-snippet and intra-snippet temporal dependencies as self-supervision signals for temporal representation learning.
Experimental results demonstrate the superiority of our TCGL over the state-of-the-art methods on large-scale action recognition and video retrieval benchmarks.
arXiv Detail & Related papers (2021-01-04T08:11:39Z)