ConvDySAT: Deep Neural Representation Learning on Dynamic Graphs via
Self-Attention and Convolutional Neural Networks
- URL: http://arxiv.org/abs/2106.11430v1
- Date: Mon, 21 Jun 2021 22:13:20 GMT
- Title: ConvDySAT: Deep Neural Representation Learning on Dynamic Graphs via
Self-Attention and Convolutional Neural Networks
- Authors: Ahmad Hafez, Atulya Praphul, Yousef Jaradt, Ezani Godwin
- Abstract summary: We present ConvDySAT as an enhancement of DySAT, one of the state-of-the-art dynamic methods, by augmenting the self-attention mechanism used in DySAT with convolutional neural networks.
We conducted single-step link prediction on a communication network and a rating network. Experimental results show significant performance gains for ConvDySAT over various state-of-the-art methods.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning node representations on temporal graphs is a fundamental
step toward learning real-world dynamic graphs efficiently. Real-world graphs
continuously evolve over time: edge weights change, nodes are added and
removed, and edges appear and disappear, whereas previous graph representation
learning methods have generally focused on static graphs. We present ConvDySAT
as an enhancement of DySAT, one of the state-of-the-art dynamic methods, by
augmenting with convolutional neural networks the self-attention mechanism
that DySAT employs to capture structural and temporal evolution. We conducted
single-step link prediction on a communication network and a rating network;
experimental results show significant performance gains for ConvDySAT over
various state-of-the-art methods.
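To make the proposed architecture concrete, here is a minimal PyTorch sketch of the ConvDySAT idea. This is not the authors' implementation: the per-snapshot structural encoder it assumes (e.g. a graph attention layer computed elsewhere), the module names, and all dimensions are illustrative; the temporal block simply adds a 1-D convolution branch to a multi-head self-attention branch over the snapshot axis.

```python
# Minimal sketch of the ConvDySAT idea (illustrative, not the authors' code).
# Structural embeddings per snapshot are assumed to come from elsewhere
# (e.g. a per-snapshot graph attention layer); this module models the
# temporal dimension with self-attention augmented by a 1-D convolution.
import torch
import torch.nn as nn

class TemporalConvAttention(nn.Module):
    def __init__(self, dim: int, heads: int = 4, kernel_size: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Length-preserving convolution over the snapshot (time) axis.
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, num_snapshots, dim), one embedding per node per time.
        attn_out, _ = self.attn(x, x, x)                         # attention branch
        conv_out = self.conv(x.transpose(1, 2)).transpose(1, 2)  # conv branch
        return self.norm(x + attn_out + conv_out)                # residual + norm

# Toy usage: 50 nodes, 6 snapshots, 64-dim structural embeddings.
emb = torch.randn(50, 6, 64)
out = TemporalConvAttention(dim=64)(emb)    # -> (50, 6, 64)

# Single-step link prediction: score a candidate edge (u, v) at the final
# snapshot, here with a simple inner-product decoder.
u, v = 3, 17
score = torch.sigmoid((out[u, -1] * out[v, -1]).sum())
print(out.shape, float(score))
```

The inner-product decoder is one common choice for single-step link prediction on the last snapshot; the paper's actual decoder and training objective may differ.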
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance in both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z)
- Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z)
- Dynamic Community Detection via Adversarial Temporal Graph Representation Learning [17.487265170798974]
In this work, an adversarial temporal graph representation learning framework is proposed to detect dynamic communities from a small sample of brain network data.
In addition, the framework employs adversarial training to guide the learning of temporal graph representations and optimizes a measurable modularity loss to maximize community modularity.
arXiv Detail & Related papers (2022-06-29T08:44:22Z)
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependencies via the training loss to improve parallelism in computation.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- Dynamic Graph Representation Learning via Graph Transformer Networks [41.570839291138114]
We propose a Transformer-based dynamic graph learning method named Dynamic Graph Transformer (DGT).
DGT uses spatial-temporal encoding to effectively learn graph topology and capture implicit links.
We show that DGT presents superior performance compared with several state-of-the-art baselines.
arXiv Detail & Related papers (2021-11-19T21:44:23Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We also provide a benchmark pipeline for evaluating temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Pre-Training on Dynamic Graph Neural Networks [26.139844652756334]
This paper proposes a pre-training method on dynamic graph neural networks (PT-DGNN).
It uses dynamic attributed graph generation tasks to simultaneously learn the structure, semantics, and evolution features of the graph.
arXiv Detail & Related papers (2021-02-24T16:06:32Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and apply a node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we use gated recurrent units (GRUs) to capture the temporal information for anomaly detection (a sketch of this pipeline follows the list).
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
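The last entry above describes its pipeline concretely enough to sketch. Below is a hedged, illustrative Python version, not the paper's code: extract the $h$-hop enclosing subgraph of a target edge, label nodes by their distances to the edge's endpoints, and score the edge with a GRU over per-snapshot subgraph features. The per-snapshot subgraph encoder is assumed to exist and is stubbed out with precomputed feature vectors.

```python
# Illustrative sketch of the pipeline in the last entry (not the paper's code):
# h-hop enclosing subgraph extraction, distance-based node labeling, and a GRU
# that turns per-snapshot subgraph features into an edge anomaly score.
import networkx as nx
import torch
import torch.nn as nn

def enclosing_subgraph(g: nx.Graph, u, v, h: int = 2) -> nx.Graph:
    """Induced subgraph on the union of the h-hop neighborhoods of u and v."""
    nodes = set(nx.single_source_shortest_path_length(g, u, cutoff=h))
    nodes |= set(nx.single_source_shortest_path_length(g, v, cutoff=h))
    return g.subgraph(nodes).copy()

def node_labels(sub: nx.Graph, u, v) -> dict:
    """Label each node by (distance-to-u, distance-to-v); -1 means unreachable."""
    du = nx.single_source_shortest_path_length(sub, u)
    dv = nx.single_source_shortest_path_length(sub, v)
    return {n: (du.get(n, -1), dv.get(n, -1)) for n in sub.nodes}

class TemporalEdgeScorer(nn.Module):
    """GRU over per-snapshot subgraph feature vectors -> anomaly probability."""
    def __init__(self, feat_dim: int, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, num_snapshots, feat_dim) subgraph features per snapshot;
        # the paper encodes each snapshot's subgraph with a GNN, assumed here.
        _, last = self.gru(feats)                 # final hidden state
        return torch.sigmoid(self.head(last[-1])).squeeze(-1)

# Toy usage on one snapshot of a random graph, plus 5 fake snapshot features.
g = nx.erdos_renyi_graph(30, 0.1, seed=0)
u, v = next(iter(g.edges))
sub = enclosing_subgraph(g, u, v, h=2)
print(sub.number_of_nodes(), "nodes;", list(node_labels(sub, u, v).items())[:3])
print(float(TemporalEdgeScorer(feat_dim=16)(torch.randn(1, 5, 16))))
```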