RDGSL: Dynamic Graph Representation Learning with Structure Learning
- URL: http://arxiv.org/abs/2309.02025v1
- Date: Tue, 5 Sep 2023 08:03:59 GMT
- Title: RDGSL: Dynamic Graph Representation Learning with Structure Learning
- Authors: Siwei Zhang, Yun Xiong, Yao Zhang, Yiheng Sun, Xi Chen, Yizhu Jiao and Yangyong Zhu
- Abstract summary: Temporal Graph Networks (TGNs) have shown remarkable performance in learning representations for continuous-time dynamic graphs.
However, real-world dynamic graphs typically contain diverse and intricate noise.
Noise can significantly degrade the quality of representation generation, impeding the effectiveness of TGNs in downstream tasks.
- Score: 23.00398150548281
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Temporal Graph Networks (TGNs) have shown remarkable performance in learning representations for continuous-time dynamic graphs. However, real-world dynamic
graphs typically contain diverse and intricate noise. Noise can significantly
degrade the quality of representation generation, impeding the effectiveness of
TGNs in downstream tasks. Though structure learning is widely applied to
mitigate noise in static graphs, its adaptation to dynamic graph settings poses
two significant challenges. i) Noise dynamics. Existing structure learning methods are ill-equipped to address the temporal aspect of noise, hampering their effectiveness against such dynamic, ever-changing noise patterns. ii) More
severe noise. Noise may be introduced along with multiple interactions between
two nodes, leading to the re-pollution of these nodes and consequently causing
more severe noise compared to static graphs. In this paper, we present RDGSL, a
representation learning method for continuous-time dynamic graphs. Alongside it, we propose dynamic graph structure learning, a novel supervisory signal that empowers RDGSL to effectively combat noise in dynamic graphs.
To address the noise dynamics issue, we introduce the Dynamic Graph Filter,
where we innovatively propose a dynamic noise function that dynamically
captures both current and historical noise, enabling us to assess the temporal
aspect of noise and generate a denoised graph. To tackle the challenge of more severe noise, we further propose the Temporal Embedding Learner, which utilizes an attention mechanism to selectively turn a blind eye to noisy edges and focus on normal edges, yielding expressive representations that remain resilient to noise. Our method demonstrates robustness in downstream tasks, with up to 5.1% absolute AUC improvement on evolving classification over the second-best baseline.
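To make the abstract's two components concrete, here is a minimal, hypothetical PyTorch sketch. The class names, tensor shapes, and the moving-average blend of historical and current noise scores are my own illustration of the stated ideas (a dynamic noise function plus attention that suppresses noisy edges), not the authors' implementation.

```python
# Illustrative sketch only; not the RDGSL authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicNoiseFilter(nn.Module):
    """Toy 'dynamic noise function': blends the current edge-noise score
    with a moving average of historical scores (assumed form)."""
    def __init__(self, dim, decay=0.9):
        super().__init__()
        self.scorer = nn.Linear(2 * dim, 1)  # scores an edge from its endpoint features
        self.decay = decay

    def forward(self, src, dst, hist_score):
        cur = torch.sigmoid(self.scorer(torch.cat([src, dst], dim=-1))).squeeze(-1)
        return self.decay * hist_score + (1 - self.decay) * cur  # historical + current

class DenoisingAttention(nn.Module):
    """Attention that 'turns a blind eye' to edges with high noise scores."""
    def __init__(self, dim):
        super().__init__()
        self.q, self.k, self.v = nn.Linear(dim, dim), nn.Linear(dim, dim), nn.Linear(dim, dim)

    def forward(self, node, neigh, noise_score):
        # node: (B, d); neigh: (B, N, d); noise_score: (B, N) in [0, 1], 1 = noisy
        att = (self.q(node).unsqueeze(1) * self.k(neigh)).sum(-1) / neigh.size(-1) ** 0.5
        att = att + torch.log1p(-noise_score.clamp(max=1 - 1e-6))  # bias logits down
        w = F.softmax(att, dim=-1)
        return (w.unsqueeze(-1) * self.v(neigh)).sum(1)
```

Adding log(1 - s) to the pre-softmax logits multiplies each edge's attention weight by (1 - s), so an edge scored as certainly noisy (s near 1) is effectively masked out.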
Related papers
- Adaptive Spatiotemporal Augmentation for Improving Dynamic Graph Learning [16.768825403934432]
STAA identifies nodes likely to have noisy edges in temporal dimensions.
It analyzes edge evolution through graph wavelet change rates.
Then, random walks are used to reduce the weights of noisy edges.
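As a rough, hypothetical illustration of that reweighting step (substituting per-edge change rates across snapshots for the paper's graph wavelet analysis), biased random walks can be steered away from fast-changing edges so that walk-derived weights shrink where noise is suspected:

```python
import numpy as np

def reweight_by_walks(snapshots, walks_per_node=50, walk_len=10, seed=0):
    """snapshots: list of at least two (n, n) nonnegative weight matrices."""
    rng = np.random.default_rng(seed)
    n = snapshots[0].shape[0]
    # crude stand-in for wavelet change rates: mean |weight change| per edge
    change = np.abs(np.diff(np.stack(snapshots), axis=0)).mean(axis=0)
    bias = snapshots[-1] / (1.0 + change)  # fast-changing edges become unlikely steps
    counts = np.zeros((n, n))
    for start in range(n):
        for _ in range(walks_per_node):
            u = start
            for _ in range(walk_len):
                row, total = bias[u], bias[u].sum()
                if total == 0:
                    break  # dead end: no outgoing weight
                v = rng.choice(n, p=row / total)
                counts[u, v] += 1
                u = v
    return counts / max(counts.max(), 1.0)  # normalized, denoised edge weights
```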
arXiv Detail & Related papers (2025-01-17T07:48:18Z) - You Can't Ignore Either: Unifying Structure and Feature Denoising for Robust Graph Learning [34.52299775051481]
We develop a unified graph denoising (UGD) framework to unravel the deadlock between structure and feature denoising.
Specifically, a high-order neighborhood proximity evaluation method is proposed to recognize noisy edges.
We also propose to refine noisy features via reconstruction with a graph auto-encoder.
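The edge-recognition step might resemble the following generic sketch, which scores each existing edge by the similarity of its endpoints' multi-hop neighborhood profiles; this is a stand-in for high-order proximity evaluation, not the UGD paper's actual criterion:

```python
import numpy as np

def edge_proximity_scores(A, k=3):
    """A: (n, n) adjacency matrix. Low scores on existing edges suggest noise."""
    P = A / A.sum(axis=1, keepdims=True).clip(min=1)  # row-normalized adjacency
    reach, Pk = np.zeros_like(A, dtype=float), np.eye(A.shape[0])
    for _ in range(k):                 # accumulate 1..k-hop reachability profiles
        Pk = Pk @ P
        reach += Pk
    prof = reach / np.linalg.norm(reach, axis=1, keepdims=True).clip(min=1e-8)
    sim = prof @ prof.T                # cosine similarity of multi-hop profiles
    return np.where(A > 0, sim, 0.0)   # scored only where an edge exists
```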
arXiv Detail & Related papers (2024-08-01T16:43:55Z) - RobGC: Towards Robust Graph Condensation [61.259453496191696]
Graph neural networks (GNNs) have attracted widespread attention for their impressive capability of graph representation learning.
However, the increasing prevalence of large-scale graphs presents a significant challenge for GNN training due to their computational demands.
We propose graph condensation (GC) to generate an informative compact graph that enables efficient training of GNNs while retaining performance.
arXiv Detail & Related papers (2024-06-19T04:14:57Z) - Temporal Graph Representation Learning with Adaptive Augmentation Contrastive [12.18909612212823]
Temporal graph representation learning aims to generate low-dimensional dynamic node embeddings to capture temporal information.
We propose a novel Temporal Graph representation learning with Adaptive augmentation Contrastive (TGAC) model.
Our experiments on various real networks demonstrate that the proposed model outperforms other temporal graph representation learning methods.
arXiv Detail & Related papers (2023-11-07T11:21:16Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves up to 10.13% performance improvement compared with state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z) - Time-aware Random Walk Diffusion to Improve Dynamic Graph Learning [3.4012007729454816]
TiaRa is a novel diffusion-based method for augmenting a dynamic graph represented as a discrete-time sequence of graph snapshots.
We show that TiaRa effectively augments a given dynamic graph, and leads to significant improvements in dynamic GNN models for various graph datasets and tasks.
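In spirit (a hypothetical sketch, not TiaRa's exact time-aware operator), each snapshot can be densified with a personalized-PageRank-style diffusion and mixed with the previous diffused snapshot so that structure persists across time steps:

```python
import numpy as np

def diffuse_snapshots(snapshots, alpha=0.15, beta=0.5):
    """snapshots: list of (n, n) adjacency matrices; returns augmented matrices."""
    out, prev = [], None
    for A in snapshots:
        P = A / A.sum(axis=1, keepdims=True).clip(min=1)  # random-walk transitions
        ppr = alpha * np.linalg.inv(np.eye(len(A)) - (1 - alpha) * P)  # closed-form PPR
        S = ppr if prev is None else beta * ppr + (1 - beta) * prev    # temporal mixing
        out.append(S)
        prev = S
    return out
```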
arXiv Detail & Related papers (2022-11-02T15:55:46Z) - DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structure contrastive learning to effectively identify the time-invariant and time-varying representations respectively.
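Both objectives are plausibly instances of the standard InfoNCE loss sketched below (a generic form, not DyTed's exact losses), applied once across temporal clips and once across structural views:

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, tau=0.2):
    """z1, z2: (N, d) embeddings of the same nodes under two views or clips."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                            # (N, N) similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives on the diagonal
    return F.cross_entropy(logits, labels)
```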
arXiv Detail & Related papers (2022-10-19T14:34:12Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - On Dynamic Noise Influence in Differentially Private Learning [102.6791870228147]
Private Gradient Descent (PGD) is a commonly used private learning framework that adds noise according to the differential privacy protocol.
Recent studies show that dynamic privacy schedules can improve performance at the final iteration, yet theoretical understanding of the effectiveness of such schedules remains limited.
This paper provides a comprehensive analysis of noise influence in dynamic privacy schedules to answer these critical questions.
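The core idea of a dynamic schedule is simply to vary the noise magnitude over iterations; the toy noisy-gradient-descent loop below shows a decreasing schedule (illustrative only: it performs no privacy accounting, which a real differential-privacy guarantee requires):

```python
import numpy as np

def noisy_gd(grad_fn, w0, steps=100, clip=1.0, sigma0=2.0, decay=0.98, lr=0.1):
    """grad_fn(w) -> gradient array; noise std decays geometrically per step."""
    w, rng = np.asarray(w0, dtype=float).copy(), np.random.default_rng(0)
    for t in range(steps):
        g = grad_fn(w)
        g = g * min(1.0, clip / max(np.linalg.norm(g), 1e-12))  # clip gradient norm
        sigma = sigma0 * decay ** t                             # dynamic noise schedule
        w = w - lr * (g + rng.normal(0.0, sigma * clip, size=g.shape))
    return w
```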
arXiv Detail & Related papers (2021-01-19T02:04:00Z) - Learning Node Representations from Noisy Graph Structures [38.32421350245066]
Noise prevails in real-world networks and compromises them to a large extent.
We propose a novel framework to learn noise-free node representations and eliminate noise simultaneously.
arXiv Detail & Related papers (2020-12-04T07:18:39Z)