TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning
- URL: http://arxiv.org/abs/2105.07944v1
- Date: Mon, 17 May 2021 15:33:25 GMT
- Title: TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning
- Authors: Lu Wang, Xiaofu Chang, Shuang Li, Yunfei Chu, Hui Li, Wei Zhang,
Xiaofeng He, Le Song, Jingren Zhou, Hongxia Yang
- Abstract summary: We propose a novel graph neural network approach, called TCL, which deals with dynamically evolving graphs in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
- Score: 87.38675639186405
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dynamic graph modeling has recently attracted much attention due to its
extensive applications in many real-world scenarios, such as recommendation
systems, financial transactions, and social networks. Although many works on
dynamic graph modeling have been proposed in recent years, effective and
scalable models are yet to be developed. In this paper, we propose a novel
graph neural network approach, called TCL, which deals with dynamically
evolving graphs in a continuous-time fashion and enables effective dynamic
node representation learning that captures both temporal and topological
information. Technically, our model contains three novel aspects. First, we
generalize the vanilla Transformer to temporal graph learning scenarios and
design a graph-topology-aware transformer. Second, on top of the proposed
graph transformer, we introduce a two-stream encoder that separately extracts
representations from the temporal neighborhoods associated with the two
interaction nodes and then utilizes a co-attentional transformer to model
their inter-dependencies at a semantic level. Finally, inspired by recently
developed contrastive learning techniques, we propose to optimize our model by
maximizing the mutual information (MI) between the predictive representations
of the two future interaction nodes. As a result, our dynamic representations
preserve high-level (or global) semantics of interactions and are thus robust
to noisy interactions. To the best of our knowledge, this is the first
attempt to apply contrastive learning to representation learning on dynamic
graphs. We evaluate our model on four benchmark datasets for interaction
prediction, and the experimental results demonstrate the superiority of our model.
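
To make the two-stream design concrete, here is a minimal PyTorch sketch of how such an encoder could be organized: each stream applies self-attention over one node's temporal neighborhood, and a co-attention module then lets each stream attend to the other's. All module names, dimensions, the shared cross-attention weights, and the mean-pooling step are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch of a two-stream temporal graph encoder with
# co-attention, loosely following the TCL abstract. Not the paper's code.
import torch
import torch.nn as nn


class TemporalGraphEncoder(nn.Module):
    """One stream: self-attention over a node's temporal neighborhood.

    Each neighbor is assumed to be featurized in advance as its node
    features plus a time encoding of the interaction timestamp.
    """

    def __init__(self, dim: int, heads: int = 4, layers: int = 2):
        super().__init__()
        block = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(block, num_layers=layers)

    def forward(self, neigh_feats: torch.Tensor) -> torch.Tensor:
        # neigh_feats: (batch, num_neighbors, dim)
        return self.encoder(neigh_feats)


class CoAttention(nn.Module):
    """Cross-attention between the two streams, so each node's sequence
    can model inter-dependencies with the other node's sequence."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        # A single shared attention module is a simplification.
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, h_u, h_v):
        u2v, _ = self.attn(h_u, h_v, h_v)  # u's sequence queries v's
        v2u, _ = self.attn(h_v, h_u, h_u)  # v's sequence queries u's
        # Mean-pool each attended sequence into one node representation.
        return u2v.mean(dim=1), v2u.mean(dim=1)


# Example: a batch of 32 interactions, 20 sampled neighbors each, dim 64.
encoder_u, encoder_v = TemporalGraphEncoder(64), TemporalGraphEncoder(64)
coattn = CoAttention(64)
z_u, z_v = coattn(encoder_u(torch.randn(32, 20, 64)),
                  encoder_v(torch.randn(32, 20, 64)))  # (32, 64) each
```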
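
The MI-maximization objective is commonly instantiated as an InfoNCE lower bound on mutual information. The sketch below treats the two representations of each interaction as a positive pair and all other in-batch pairings as negatives; the temperature and the symmetric form are standard contrastive-learning choices, not details confirmed by the abstract.

```python
import torch
import torch.nn.functional as F


def infonce_loss(z_u: torch.Tensor, z_v: torch.Tensor,
                 temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style lower bound on the MI between the predictive
    representations of the two nodes of each future interaction.

    z_u, z_v: (batch, dim); row i of z_u and row i of z_v form the
    positive pair, and all other in-batch pairings serve as negatives.
    """
    z_u = F.normalize(z_u, dim=-1)
    z_v = F.normalize(z_v, dim=-1)
    logits = z_u @ z_v.t() / temperature  # (batch, batch) similarities
    targets = torch.arange(z_u.size(0), device=z_u.device)
    # Symmetric cross-entropy over the u->v and v->u directions.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```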
Related papers
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
  We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling. We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
  arXiv Detail & Related papers (2024-05-22T19:10:24Z)
- Signed Graph Neural Ordinary Differential Equation for Modeling Continuous-time Dynamics [13.912268915939656]
  The prevailing approach of integrating graph neural networks with ordinary differential equations has demonstrated promising performance. We introduce a novel approach: a signed graph neural ordinary differential equation that addresses prior models' failure to capture signed information. Our proposed solution offers both flexibility and efficiency.
  arXiv Detail & Related papers (2023-12-18T13:45:33Z)
- Dynamic Graph Representation Learning with Neural Networks: A Survey [0.0]
  Dynamic graph representation learning has emerged as a new machine learning problem. This paper provides a review of problems and models related to dynamic graph learning.
  arXiv Detail & Related papers (2023-04-12T09:39:17Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
  This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability. EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from evolving graph data.
  arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations [21.936437653875245]
  This paper focuses on representation learning for dynamic graphs with temporal interactions. We propose a generic differential model for dynamic graphs that characterises the continuously dynamic evolution of node embedding trajectories. Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without integration by segments.
  arXiv Detail & Related papers (2023-02-22T12:59:38Z)
- DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
  We propose DyTed, a novel disentangled representation learning framework for discrete-time dynamic graphs. We specially design a temporal-clips contrastive learning task together with a structure contrastive learning task to effectively identify the time-invariant and time-varying representations, respectively.
  arXiv Detail & Related papers (2022-10-19T14:34:12Z)
- EXPERT: Public Benchmarks for Dynamic Heterogeneous Academic Graphs [5.4744970832051445]
  We present a variety of large-scale, dynamic heterogeneous academic graphs to test the effectiveness of models developed for graph forecasting tasks. Our novel datasets cover both context and content information extracted from scientific publications across two communities: Artificial Intelligence (AI) and Nuclear Nonproliferation (NN).
  arXiv Detail & Related papers (2022-04-14T19:43:34Z)
- Learning Dual Dynamic Representations on Time-Sliced User-Item Interaction Graphs for Sequential Recommendation [62.30552176649873]
  We devise a novel Dynamic Representation Learning model for Sequential Recommendation (DRL-SRe). To better model user-item interactions and characterize the dynamics from both sides, the proposed model builds a global user-item interaction graph for each time slice. To enable the model to capture fine-grained temporal information, we propose an auxiliary temporal prediction task over consecutive time slices.
  arXiv Detail & Related papers (2021-09-24T07:44:27Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
  We learn dynamic graph representations in hyperbolic space for the first time, aiming to infer node representations. We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN. In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
  arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
  This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs represented by a tensor. The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and delivers remarkable SSL performance on protein-to-protein interaction networks.
  arXiv Detail & Related papers (2020-03-15T02:33:21Z)