Temporal Graph Networks for Graph Anomaly Detection in Financial Networks
- URL: http://arxiv.org/abs/2404.00060v1
- Date: Wed, 27 Mar 2024 07:17:16 GMT
- Title: Temporal Graph Networks for Graph Anomaly Detection in Financial Networks
- Authors: Yejin Kim, Youngbin Lee, Minyoung Choe, Sungju Oh, Yongjae Lee
- Abstract summary: This paper explores the utilization of Temporal Graph Networks (TGN) for financial anomaly detection.
We compare TGN's performance against static Graph Neural Network (GNN) baselines, as well as cutting-edge hypergraph neural network baselines.
Our results demonstrate that TGN significantly outperforms other models in terms of AUC metrics.
- Score: 28.353194998824538
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper explores the utilization of Temporal Graph Networks (TGN) for financial anomaly detection, a pressing need in the era of fintech and digitized financial transactions. We present a comprehensive framework that leverages TGN, capable of capturing dynamic changes in edges within financial networks, for fraud detection. Our study compares TGN's performance against static Graph Neural Network (GNN) baselines, as well as cutting-edge hypergraph neural network baselines using DGraph dataset for a realistic financial context. Our results demonstrate that TGN significantly outperforms other models in terms of AUC metrics. This superior performance underlines TGN's potential as an effective tool for detecting financial fraud, showcasing its ability to adapt to the dynamic and complex nature of modern financial systems. We also experimented with various graph embedding modules within the TGN framework and compared the effectiveness of each module. In conclusion, we demonstrated that, even with variations within TGN, it is possible to achieve good performance in the anomaly detection task.
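The abstract centers on TGN's ability to capture dynamic edge changes via a per-node memory that is updated by timestamped interaction events. The following is a minimal, hypothetical sketch of that idea in plain NumPy (the random matrices stand in for TGN's learned message and memory-update functions, which in the actual model are trained modules such as a GRU); it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8
N_NODES = 5

# Hypothetical fixed parameters standing in for TGN's *learned*
# message function and memory updater.
W_msg = rng.normal(size=(2 * DIM + 1, DIM)) * 0.1
W_upd = rng.normal(size=(2 * DIM, DIM)) * 0.1

memory = np.zeros((N_NODES, DIM))   # one memory vector per node
last_seen = np.zeros(N_NODES)       # last update time per node

def observe_edge(src, dst, t):
    """Update the source node's memory from a timestamped interaction."""
    dt = t - last_seen[src]                          # time since last event
    raw = np.concatenate([memory[src], memory[dst], [dt]])
    msg = np.tanh(raw @ W_msg)                       # message function
    upd_in = np.concatenate([memory[src], msg])
    memory[src] = np.tanh(upd_in @ W_upd)            # memory update (GRU in real TGN)
    last_seen[src] = t

def anomaly_score(src, dst):
    """Toy decoder: edges between dissimilar memories score as more anomalous."""
    return float(-memory[src] @ memory[dst])

# Replay a small stream of timestamped edges, then score a candidate edge.
for step, (s, d) in enumerate([(0, 1), (1, 2), (0, 2), (3, 4)]):
    observe_edge(s, d, t=float(step))

print(anomaly_score(0, 1))
```

In the real TGN, `W_msg` and `W_upd` are trained end to end, and a graph embedding module (attention over temporal neighbors) sits on top of the memory before the decoder.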
Related papers
- Uncertainty in Graph Neural Networks: A Survey [50.63474656037679]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications.
However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions.
This survey aims to provide a comprehensive overview of the GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z)
- T-GAE: Transferable Graph Autoencoder for Network Alignment [79.89704126746204]
T-GAE is a graph autoencoder framework that leverages transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
arXiv Detail & Related papers (2023-10-05T02:58:29Z)
- ChatGPT Informed Graph Neural Network for Stock Movement Prediction [8.889701868315717]
We introduce a novel framework that leverages ChatGPT's graph inference capabilities to enhance Graph Neural Networks (GNNs).
Our framework adeptly extracts evolving network structures from textual data, and incorporates these networks into graph neural networks for subsequent predictive tasks.
arXiv Detail & Related papers (2023-05-28T21:11:59Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Analysis of different temporal graph neural network configurations on dynamic graphs [0.0]
This project aims to address the gap in the literature by performing a qualitative analysis of spatial-temporal dependence structure learning on dynamic graphs.
An extensive ablation study will be conducted on different variants of the best-performing TGN to identify the key factors contributing to its performance.
By achieving these objectives, this project will provide valuable insights into the design and optimization of TGNs for dynamic graph analysis.
arXiv Detail & Related papers (2023-05-02T00:07:33Z)
- GraphTTA: Test Time Adaptation on Graph Neural Networks [10.582212966736645]
We present a novel test time adaptation strategy named Graph Adversarial Pseudo Group Contrast (GAPGC) for graph neural networks (GNNs).
GAPGC employs a contrastive learning variant as a self-supervised task during TTA, equipped with Adversarial Learnable Augmenter and Group Pseudo-Positive Samples.
We provide theoretical evidence that GAPGC can extract minimal sufficient information for the main task from an information-theoretic perspective.
arXiv Detail & Related papers (2022-08-19T02:24:16Z)
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
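The even-polynomial filter idea can be illustrated with a toy, hypothetical NumPy sketch (illustrative coefficients, not EvenNet's learned ones): a filter of the form sum_k theta_k * A_hat^(2k) aggregates only even-hop neighborhoods, so a node's representation never depends on its odd-hop neighbors.

```python
import numpy as np

# Toy 4-node path graph: 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetrically normalized adjacency A_hat = D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))

X = np.eye(4)               # one-hot node features
theta = [0.5, 0.3, 0.2]     # illustrative (not learned) filter coefficients

# Even-polynomial filter: sum_k theta_k * A_hat^(2k) @ X; odd powers are skipped.
H = sum(t * np.linalg.matrix_power(A_hat, 2 * k) @ X
        for k, t in enumerate(theta))

# Node 0's row mixes itself and its even-hop neighbor (node 2) but puts
# exactly zero weight on its odd-hop neighbors (nodes 1 and 3).
print(np.round(H[0], 3))
```

Skipping odd powers is what makes the filter insensitive to whether edges connect same-label (homophilous) or different-label (heterophilous) nodes.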
arXiv Detail & Related papers (2022-05-27T10:48:14Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
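A hypothetical sketch of the pruning idea (not PTDNet's actual parameterization, which is trained end to end with the downstream GNN): score each edge from its endpoint features, drop low-scoring edges, and penalize the relaxed edge count of the sparsified graph.

```python
import numpy as np

rng = np.random.default_rng(1)

edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
feat = rng.normal(size=(4, 6))   # toy node features

# Stand-in for the parameterized denoising network: a logistic score
# per edge computed from the concatenated endpoint features.
w = rng.normal(size=12)
def edge_score(u, v):
    return 1.0 / (1.0 + np.exp(-np.concatenate([feat[u], feat[v]]) @ w))

scores = np.array([edge_score(u, v) for u, v in edges])
kept = [e for e, s in zip(edges, scores) if s > 0.5]   # sparsified graph

# Sparsity penalty: expected number of surviving edges. Adding this to the
# task loss pushes the scorer to keep only task-relevant edges.
sparsity_penalty = scores.sum()
print(len(kept), round(float(sparsity_penalty), 3))
```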
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- Multi-Graph Tensor Networks [23.030263841031633]
We introduce a novel Multi-Graph Tensor Network (MGTN) framework, which exploits the ability of graphs to handle irregular data sources and the compression properties of tensor networks in a deep learning setting.
By virtue of the MGTN, a FOREX currency graph is leveraged to impose an economically meaningful structure on this demanding task, yielding performance superior to three competing models at drastically lower complexity.
arXiv Detail & Related papers (2020-10-25T20:14:57Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
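The denoising view can be summarized as follows (our paraphrase of the standard derivation, with L = I - A_hat the normalized Laplacian): neighborhood aggregation is one gradient step on a smoothness-regularized reconstruction objective.

```latex
% Graph signal denoising objective over recovered signal F:
\min_{F}\; \|F - X\|_F^2 + c\,\operatorname{tr}\!\left(F^\top L F\right)
% One gradient step from F = X with step size 1/2:
F \;\leftarrow\; X - c\,L X \;=\; (1-c)\,X + c\,\hat{A} X
% For c = 1 this recovers GCN-style aggregation F = \hat{A} X.
```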
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.