Self-Supervised Dynamic Graph Representation Learning via Temporal
Subgraph Contrast
- URL: http://arxiv.org/abs/2112.08733v1
- Date: Thu, 16 Dec 2021 09:35:34 GMT
- Title: Self-Supervised Dynamic Graph Representation Learning via Temporal
Subgraph Contrast
- Authors: Linpu Jiang, Ke-Jia Chen, Jingqiang Chen
- Abstract summary: This paper proposes a self-supervised dynamic graph representation learning framework (DySubC)
DySubC defines a temporal subgraph contrastive learning task to simultaneously learn the structural and evolutional features of a dynamic graph.
Experiments on five real-world datasets demonstrate that DySubC performs better than the related baselines.
- Score: 0.8379286663107846
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Self-supervised learning on graphs has recently drawn a lot of attention due
to its independence from labels and the robustness of the representations it learns.
Current studies on this topic mainly use static information such as graph structure
but cannot adequately capture dynamic information such as edge timestamps.
Realistic graphs are often dynamic, which means the interaction between nodes
occurs at a specific time. This paper proposes a self-supervised dynamic graph
representation learning framework (DySubC), which defines a temporal subgraph
contrastive learning task to simultaneously learn the structural and
evolutional features of a dynamic graph. Specifically, a novel temporal
subgraph sampling strategy is first proposed, which takes each node of the
dynamic graph as the central node and uses both neighborhood structures and
edge timestamps to sample the corresponding temporal subgraph. The subgraph
representation function is then designed according to the influence of
neighborhood nodes on the central node after encoding the nodes in each
subgraph. Finally, the structural and temporal contrastive losses are defined to
maximize the mutual information between node representation and temporal
subgraph representation. Experiments on five real-world datasets demonstrate
that (1) DySubC performs better than the related baselines including two graph
contrastive learning models and four dynamic graph representation learning
models in the downstream link prediction task, and (2) the use of temporal
information not only helps sample more effective subgraphs, but also yields better
representations through the temporal contrastive loss.
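The pipeline the abstract describes (recency-aware subgraph sampling, an influence-weighted subgraph readout, and a node-versus-subgraph contrastive objective) can be sketched in a few lines of NumPy. This is an illustrative approximation, not the authors' implementation: the function names, the exponential recency weighting, the InfoNCE-style loss, and the toy data are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_temporal_subgraph(edges, center, k=3):
    """Pick the k neighbors of `center` with the most recent edge
    timestamps -- a simplified stand-in for sampling that combines
    neighborhood structure and edge time."""
    nbrs = [(v, t) for (u, v, t) in edges if u == center]
    nbrs += [(u, t) for (u, v, t) in edges if v == center]
    nbrs.sort(key=lambda p: -p[1])          # newest interactions first
    return [center] + [v for v, _ in nbrs[:k]]

def subgraph_embedding(H, nodes, taus):
    """Time-weighted mean of node embeddings: nodes with more recent
    activity get larger influence on the subgraph representation."""
    w = np.exp(np.array([taus[n] for n in nodes], dtype=float))
    w /= w.sum()
    return w @ H[nodes]

def info_nce(z, s_pos, s_negs, temp=0.5):
    """Contrastive loss: pull the node embedding z toward its own
    temporal-subgraph embedding, push it away from other subgraphs."""
    def sim(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([sim(z, s_pos)] + [sim(z, s) for s in s_negs]) / temp
    logits -= logits.max()                  # numerical stability
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())

# Toy dynamic graph: (u, v, timestamp) triples.
edges = [(0, 1, 5.0), (0, 2, 1.0), (0, 3, 9.0), (1, 2, 2.0), (2, 3, 4.0)]
taus = {0: 9.0, 1: 5.0, 2: 4.0, 3: 9.0}    # last interaction time per node
H = rng.normal(size=(4, 8))                # stand-in encoder output

nodes = sample_temporal_subgraph(edges, center=0)
s_pos = subgraph_embedding(H, nodes, taus)
s_neg = subgraph_embedding(H, sample_temporal_subgraph(edges, center=2), taus)
loss = info_nce(H[0], s_pos, [s_neg])
print(f"loss = {loss:.4f}")
```

In a real training loop the encoder producing `H` would be a GNN updated by backpropagating this loss over many positive and negative subgraph pairs.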
Related papers
- Graph-Level Embedding for Time-Evolving Graphs [24.194795771873046]
Graph representation learning (also known as network embedding) has been extensively researched with varying levels of granularity.
We present a novel method for temporal graph-level embedding that addresses this gap.
arXiv Detail & Related papers (2023-06-01T01:50:37Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves at most 10.13% performance improvement compared with the state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z)
- Dynamic Graph Node Classification via Time Augmentation [15.580277876084873]
We propose the Time Augmented Graph Dynamic Neural Network (TADGNN) framework for node classification on dynamic graphs.
TADGNN consists of two modules: 1) a time augmentation module that captures the temporal evolution of nodes across time structurally, creating a time-augmented temporal graph, and 2) an information propagation module that learns the dynamic representations for each node across time using the constructed time-augmented graph.
Experimental results demonstrate that the TADGNN framework outperforms several static and dynamic state-of-the-art (SOTA) GNN models while demonstrating superior scalability.
arXiv Detail & Related papers (2022-12-07T04:13:23Z)
- DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structure contrastive learning to effectively identify the time-invariant and time-varying representations respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z)
- Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We also provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Graph-Time Convolutional Neural Networks [9.137554315375919]
We represent spatial relationships through product graphs with a first-principles graph-time convolutional neural network (GTCNN).
We develop a graph-time convolutional filter by following the shift-and-sum temporal operator to learn higher-level features over the product graph.
We develop a zero-pad pooling that preserves the spatial graph while reducing the number of active nodes and the parameters.
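The shift-and-sum filter this summary mentions has a compact form: build the product-graph shift operator as a Kronecker product of the temporal and spatial shift operators, then sum powers of it weighted by filter taps. A minimal NumPy sketch, with the toy graphs and tap values chosen arbitrarily for illustration:

```python
import numpy as np

# Toy graphs: a 3-node spatial path and a 2-step temporal path.
S_G = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
S_T = np.array([[0, 1], [1, 0]], dtype=float)

# Shift operator of the (Kronecker) product graph over (time, node) pairs.
S = np.kron(S_T, S_G)

x = np.arange(6, dtype=float)   # signal indexed by (time step, node)
h = [1.0, 0.5, 0.25]            # filter taps

# Shift-and-sum graph convolution: y = sum_k h_k * S^k @ x
y = sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))
print(y.shape)
```

Each application of `S` mixes a node's signal with its spatial neighbors at adjacent time steps, so higher powers reach further in both space and time.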
arXiv Detail & Related papers (2021-03-02T14:03:44Z)
- Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize Gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
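The first step in that pipeline, extracting the $h$-hop enclosing subgraph around a target edge, reduces to a bounded breadth-first search from both endpoints. A hedged sketch (the function name and toy adjacency list are assumptions; the node-labeling and GRU stages are omitted):

```python
from collections import deque

def h_hop_enclosing_subgraph(adj, edge, h=1):
    """All nodes within h hops of either endpoint of `edge`: the
    enclosing subgraph that serves as the per-edge structural unit."""
    nodes = set()
    for root in edge:
        dist = {root: 0}          # BFS distances from this endpoint
        queue = deque([root])
        while queue:
            n = queue.popleft()
            if dist[n] == h:      # stop expanding at the hop limit
                continue
            for m in adj.get(n, ()):
                if m not in dist:
                    dist[m] = dist[n] + 1
                    queue.append(m)
        nodes.update(dist)
    return sorted(nodes)

# Toy adjacency list: edges 0-1, 0-2, 1-3, 3-4.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3]}
print(h_hop_enclosing_subgraph(adj, (0, 1), h=1))  # → [0, 1, 2, 3]
```

Note the BFS is run separately from each endpoint so that a node reached early from one side does not block exploration from the other.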
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
- Inductive Representation Learning on Temporal Graphs [33.44276155380476]
Temporal dynamic graphs require handling new nodes as well as capturing temporal patterns.
We propose the temporal graph attention layer to efficiently aggregate temporal-topological neighborhood features.
By stacking TGAT layers, the network recognizes the node embeddings as functions of time and is able to inductively infer embeddings for both new and observed nodes.
arXiv Detail & Related papers (2020-02-19T02:05:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.