Anomaly Detection in Dynamic Graphs via Transformer
- URL: http://arxiv.org/abs/2106.09876v1
- Date: Fri, 18 Jun 2021 02:27:19 GMT
- Title: Anomaly Detection in Dynamic Graphs via Transformer
- Authors: Yixin Liu, Shirui Pan, Yu Guang Wang, Fei Xiong, Liang Wang, Vincent
CS Lee
- Abstract summary: We present a novel Transformer-based Anomaly Detection framework for DYnamic graph (TADDY).
Our framework constructs a comprehensive node encoding strategy to better represent each node's structural and temporal roles in an evolving graph stream.
Our proposed TADDY framework outperforms the state-of-the-art methods by a large margin on four real-world datasets.
- Score: 30.926884264054042
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Detecting anomalies in dynamic graphs has drawn increasing attention due to
their wide applications in social networks, e-commerce, and cybersecurity. The
recent deep learning-based approaches have shown promising results over shallow
methods. However, they fail to address two core challenges of anomaly detection
in dynamic graphs: the lack of informative encoding for unattributed nodes and
the difficulty of learning discriminative knowledge from coupled spatial-temporal
dynamic graphs. To overcome these challenges, in this paper, we present a novel
Transformer-based Anomaly Detection framework for DYnamic graph (TADDY). Our
framework constructs a comprehensive node encoding strategy to better represent
each node's structural and temporal roles in an evolving graph stream.
Meanwhile, TADDY captures informative representation from dynamic graphs with
coupled spatial-temporal patterns via a dynamic graph transformer model. The
extensive experimental results demonstrate that our proposed TADDY framework
outperforms the state-of-the-art methods by a large margin on four real-world
datasets.
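The abstract describes two ingredients: a node encoding that captures each node's structural and temporal roles, and a transformer that contextualizes those encodings to score edges. As a rough illustration (not the paper's actual architecture), the sketch below sums two hypothetical role encodings per context node, applies one unlearned self-attention step, and pools to a sigmoid anomaly score; all tables, dimensions, and the readout vector `w` are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16  # embedding dimension (hypothetical)

# Hypothetical encoding tables: the paper combines several role encodings;
# here we sketch just two -- hop distance to the target edge (structural)
# and snapshot offset (temporal). Values are random stand-ins, not learned.
dist_table = rng.normal(size=(4, D))   # hop distances 0..3
time_table = rng.normal(size=(5, D))   # snapshot offsets 0..4

def encode_nodes(hop_dists, time_offsets):
    """Sum role encodings into one vector per context node."""
    return dist_table[hop_dists] + time_table[time_offsets]

def self_attention(X):
    """Single-head scaled dot-product self-attention (no learned weights)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X

def edge_anomaly_score(hop_dists, time_offsets, w):
    """Pool attended node representations and map to a score in (0, 1)."""
    H = self_attention(encode_nodes(hop_dists, time_offsets))
    pooled = H.mean(axis=0)
    return 1.0 / (1.0 + np.exp(-pooled @ w))

w = rng.normal(size=D)  # stand-in for a learned readout
score = edge_anomaly_score(np.array([0, 0, 1, 2]), np.array([0, 1, 1, 2]), w)
print(float(score))
```

In the actual framework the encodings and attention weights are trained end-to-end; this sketch only shows how structural and temporal roles can be merged into a single token sequence for a transformer.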
Related papers
- Information propagation dynamics in Deep Graph Networks [1.8130068086063336]
Deep Graph Networks (DGNs) have emerged as a family of deep learning models that can process and learn structured information.
This thesis investigates the dynamics of information propagation within DGNs for static and dynamic graphs, focusing on their design as dynamical systems.
arXiv Detail & Related papers (2024-10-14T12:55:51Z) - Anomaly Detection in Dynamic Graphs: A Comprehensive Survey [0.23020018305241333]
This survey paper presents a comprehensive and conceptual overview of anomaly detection using dynamic graphs.
We focus on existing graph-based anomaly detection (AD) techniques and their applications to dynamic networks.
arXiv Detail & Related papers (2024-05-31T18:54:00Z) - Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z) - GADY: Unsupervised Anomaly Detection on Dynamic Graphs [18.1896489628884]
We propose a continuous dynamic graph model to capture the fine-grained information, which breaks the limit of existing discrete methods.
For the second challenge, we pioneer the use of Generative Adversarial Networks to generate negative interactions.
Our proposed GADY significantly outperforms the previous state-of-the-art method on three real-world datasets.
arXiv Detail & Related papers (2023-10-25T05:27:45Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error (RMSE) results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z) - DyTed: Disentangled Representation Learning for Discrete-time Dynamic
Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structure contrastive learning to effectively identify the time-invariant and time-varying representations respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z) - Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z) - Dynamic Graph Representation Learning via Graph Transformer Networks [41.570839291138114]
We propose a Transformer-based dynamic graph learning method named Dynamic Graph Transformer (DGT)
DGT has spatial-temporal encoding to effectively learn graph topology and capture implicit links.
We show that DGT presents superior performance compared with several state-of-the-art baselines.
arXiv Detail & Related papers (2021-11-19T21:44:23Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Structural Temporal Graph Neural Networks for Anomaly Detection in
Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
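The subgraph-extraction step this entry describes can be sketched with plain BFS: gather all nodes within $h$ hops of the target edge's endpoints and label each node by its distance pair to the two endpoints. This is a minimal stand-in for the paper's node labeling function, not its exact definition; the adjacency dict and graph below are hypothetical.

```python
from collections import deque

def bfs_dists(adj, src, h):
    """Distances from src to all nodes reachable within h hops (BFS)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        if dist[v] == h:
            continue  # do not expand past the h-hop frontier
        for w in adj.get(v, ()):
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

def enclosing_subgraph(adj, u, v, h):
    """h-hop enclosing subgraph around edge (u, v).

    Each node gets a simple role label (dist-to-u, dist-to-v);
    unreachable-within-h distances are capped at h + 1.
    """
    du, dv = bfs_dists(adj, u, h), bfs_dists(adj, v, h)
    nodes = set(du) | set(dv)
    labels = {n: (du.get(n, h + 1), dv.get(n, h + 1)) for n in nodes}
    return nodes, labels

# Toy undirected graph as an adjacency dict (hypothetical example)
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3]}
nodes, labels = enclosing_subgraph(adj, 0, 1, h=1)
print(sorted(nodes), labels[0], labels[1])  # → [0, 1, 2, 3] (0, 1) (1, 0)
```

In the full model, these labels would be turned into node features and fed, per snapshot, into the GNN-plus-GRU pipeline the abstract outlines.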
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.