Dynamic Graph Node Classification via Time Augmentation
- URL: http://arxiv.org/abs/2212.03449v1
- Date: Wed, 7 Dec 2022 04:13:23 GMT
- Title: Dynamic Graph Node Classification via Time Augmentation
- Authors: Jiarui Sun, Mengting Gu, Chin-Chia Michael Yeh, Yujie Fan, Girish
Chowdhary, Wei Zhang
- Abstract summary: We propose the Time Augmented Dynamic Graph Neural Network (TADGNN) framework for node classification on dynamic graphs.
TADGNN consists of two modules: 1) a time augmentation module that captures the temporal evolution of nodes across time structurally, creating a time-augmented spatio-temporal graph, and 2) an information propagation module that learns the dynamic representations for each node across time using the constructed time-augmented graph.
Experimental results demonstrate that the TADGNN framework outperforms several static and dynamic state-of-the-art (SOTA) GNN models while exhibiting superior scalability.
- Score: 15.580277876084873
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Node classification for graph-structured data aims to classify nodes whose
labels are unknown. While studies on static graphs are prevalent, few studies
have focused on dynamic graph node classification. Node classification on
dynamic graphs is challenging for two reasons. First, the model needs to
capture both structural and temporal information, particularly on dynamic
graphs with long histories that require large receptive fields. Second, model
scalability becomes a significant concern as the size of the dynamic graph
increases. To address these problems, we propose the Time Augmented Dynamic
Graph Neural Network (TADGNN) framework. TADGNN consists of two modules: 1) a
time augmentation module that captures the temporal evolution of nodes across
time structurally, creating a time-augmented spatio-temporal graph, and 2) an
information propagation module that learns the dynamic representations for each
node across time using the constructed time-augmented graph. We perform node
classification experiments on four dynamic graph benchmarks. Experimental
results demonstrate that the TADGNN framework outperforms several static and
dynamic state-of-the-art (SOTA) GNN models while exhibiting superior
scalability. We also conduct theoretical and empirical analyses to validate the
efficiency of the proposed method. Our code is available at
https://sites.google.com/view/tadgnn.
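The two-module design described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the forward-in-time temporal edges, and the mean-aggregation propagation are all illustrative assumptions. The idea is that each node gets one copy per snapshot, snapshots contribute spatial edges, and each node's copy at time t is linked to its copy at time t+1, so a standard propagation step over the augmented graph mixes structural and temporal information:

```python
import numpy as np

def time_augmented_adjacency(snapshots):
    """Build one adjacency matrix over all (node, time) copies.

    snapshots: list of T adjacency matrices, each of shape (N, N).
    Spatial edges come from each snapshot; a directed temporal edge
    links node v's copy at time t to its copy at time t+1.
    """
    T = len(snapshots)
    N = snapshots[0].shape[0]
    A = np.zeros((T * N, T * N))
    for t, A_t in enumerate(snapshots):
        # Spatial edges within snapshot t occupy a diagonal block.
        A[t * N:(t + 1) * N, t * N:(t + 1) * N] = A_t
        if t + 1 < T:
            # Temporal edge: node v at time t -> node v at time t+1.
            idx = np.arange(N)
            A[t * N + idx, (t + 1) * N + idx] = 1.0
    return A

def propagate(A, X, steps=2):
    """Mean-neighbor feature propagation over the augmented graph
    (a GCN-like stand-in for the information propagation module)."""
    A_hat = A + np.eye(A.shape[0])                       # add self-loops
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)       # row-normalize
    for _ in range(steps):
        X = D_inv * (A_hat @ X)
    return X
```

Under this sketch, a dynamic graph with T snapshots of N nodes becomes a single static graph with T*N node copies, so any static propagation scheme can be reused; the temporal edges are what let information flow across time.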
Related papers
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling.
We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
arXiv Detail & Related papers (2024-05-22T19:10:24Z) - Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep temporal graph clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
arXiv Detail & Related papers (2023-05-18T06:17:50Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves at most 10.13% performance improvement compared with the state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z) - ROLAND: Graph Learning Framework for Dynamic Graphs [75.96510058864463]
Graph Neural Networks (GNNs) have been successfully applied to many real-world static graphs.
Existing dynamic GNNs do not incorporate state-of-the-art designs from static GNNs.
We propose ROLAND, an effective graph representation learning framework for real-world dynamic graphs.
arXiv Detail & Related papers (2022-08-15T14:51:47Z) - Scaling Up Dynamic Graph Representation Learning via Spiking Neural
Networks [23.01100055999135]
We present a scalable framework, namely SpikeNet, to efficiently capture the temporal and structural patterns of temporal graphs.
As a low-power alternative to RNNs, SNNs explicitly model graph dynamics as spike trains of neuron populations.
SpikeNet generalizes to a large temporal graph with significantly fewer parameters and computation overheads.
arXiv Detail & Related papers (2022-08-15T09:22:15Z) - Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Dynamic Graph Learning-Neural Network for Multivariate Time Series
Modeling [2.3022070933226217]
We propose a novel framework, namely the static- and dynamic-graph learning-neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
arXiv Detail & Related papers (2021-12-06T08:19:15Z) - Learning Attribute-Structure Co-Evolutions in Dynamic Graphs [28.848851822725933]
We present a novel framework called CoEvoGNN for modeling dynamic attributed graph sequence.
It preserves the impact of earlier graphs on the current graph by embedding generation through the sequence.
It has a temporal self-attention mechanism to model long-range dependencies in the evolution.
arXiv Detail & Related papers (2020-07-25T20:07:28Z) - Structural Temporal Graph Neural Networks for Anomaly Detection in
Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.