Graph Neural Networks in Real-Time Fraud Detection with Lambda Architecture
- URL: http://arxiv.org/abs/2110.04559v1
- Date: Sat, 9 Oct 2021 12:57:18 GMT
- Title: Graph Neural Networks in Real-Time Fraud Detection with Lambda Architecture
- Authors: Mingxuan Lu, Zhichao Han, Zitao Zhang, Yang Zhao, Yinan Shan
- Abstract summary: We first present a novel Directed Dynamic Snapshot (DDS) linkage design for graph construction and a Lambda Neural Networks (LNN) architecture for effective inference with Graph Neural Network embeddings.
Experiments show that our LNN on the DDS graph significantly outperforms baseline models and is computationally efficient for real-time fraud detection.
- Score: 14.435076554010985
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Transaction checkout fraud detection is an essential risk control component
for E-commerce marketplaces. To leverage graph networks to reduce the fraud rate
efficiently and to guarantee that information flows to a checkout only from
checkouts in its past, we first present a novel Directed Dynamic Snapshot (DDS)
linkage design for graph construction and a Lambda Neural Networks (LNN)
architecture for effective inference with Graph Neural Network embeddings.
Experiments show that our LNN on the DDS graph significantly outperforms
baseline models and is computationally efficient for real-time fraud detection.
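The abstract only outlines the DDS idea, namely that a checkout may receive neighbor information only from checkouts in its past. Below is a minimal sketch of that past-only linkage under assumptions of our own: checkouts are taken to be linked through shared entities such as cards or devices, and the `Checkout` fields, entity keys, and function name are hypothetical rather than the paper's implementation. In a Lambda-style deployment, such a graph would typically be maintained by a batch layer, with the newest checkout scored online against precomputed neighbor embeddings.

```python
# A minimal sketch of the "past-only" linkage idea, NOT the paper's exact DDS
# construction: checkouts are assumed to be linked through shared entities
# (card, device, ...) and every edge points from a strictly earlier checkout
# to a later one, so GNN message passing can never leak future information.
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List, Set


@dataclass
class Checkout:
    txn_id: str
    timestamp: int       # epoch seconds; field names here are illustrative
    entities: Set[str]   # e.g. {"card:111", "device:aaa"}


def build_past_only_graph(checkouts: List[Checkout]) -> Dict[str, List[str]]:
    """Return adjacency {checkout_id: [earlier checkout_ids it links to]}."""
    seen_by_entity: Dict[str, List[str]] = defaultdict(list)
    in_edges: Dict[str, Set[str]] = defaultdict(set)

    # Process checkouts in time order so a node can only see older neighbors.
    for co in sorted(checkouts, key=lambda c: c.timestamp):
        for ent in co.entities:
            for past_txn in seen_by_entity[ent]:
                in_edges[co.txn_id].add(past_txn)   # edge: past -> current
            seen_by_entity[ent].append(co.txn_id)
    return {txn: sorted(srcs) for txn, srcs in in_edges.items()}


if __name__ == "__main__":
    demo = [
        Checkout("t1", 100, {"card:111", "device:aaa"}),
        Checkout("t2", 200, {"card:111"}),
        Checkout("t3", 300, {"device:aaa", "card:222"}),
    ]
    print(build_past_only_graph(demo))   # {'t2': ['t1'], 't3': ['t1']}
```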
Related papers
- Advanced Financial Fraud Detection Using GNN-CL Model [13.5240775562349]
The innovative GNN-CL model proposed in this paper marks a breakthrough in the field of financial fraud detection.
It combines the advantages of graph neural networks (GNNs), convolutional neural networks (CNNs), and long short-term memory (LSTM) networks.
A key novelty of this paper is the use of multilayer perceptrons (MLPs) to estimate node similarity.
arXiv Detail & Related papers (2024-07-09T03:59:06Z) - Applying Self-supervised Learning to Network Intrusion Detection for
Network Flows with Graph Neural Network [8.318363497010969]
This paper studies the application of GNNs to identify the specific types of network flows in an unsupervised manner.
To the best of our knowledge, it is the first GNN-based self-supervised method for the multiclass classification of network flows in NIDS.
arXiv Detail & Related papers (2024-03-03T12:34:13Z) - Transaction Fraud Detection via Spatial-Temporal-Aware Graph Transformer [5.043422340181098]
We propose a novel graph neural network called Spatial-Temporal-Aware Graph Transformer (STA-GT) for transaction fraud detection problems.
Specifically, we design a temporal encoding strategy to capture temporal dependencies and incorporate it into the graph neural network framework.
We introduce a transformer module to learn local and global information.
arXiv Detail & Related papers (2023-07-11T08:56:53Z) - Influencer Detection with Dynamic Graph Neural Networks [56.1837101824783]
We investigate different dynamic Graph Neural Network (GNN) configurations for influencer detection.
We show that using deep multi-head attention in GNNs and encoding temporal attributes significantly improves performance.
arXiv Detail & Related papers (2022-11-15T13:00:25Z) - Anomal-E: A Self-Supervised Network Intrusion Detection System based on
Graph Neural Networks [0.0]
This paper investigates the application of Graph Neural Networks (GNNs) to self-supervised network intrusion and anomaly detection.
GNNs are a deep learning approach for graph-based data that incorporate graph structures into learning.
We present Anomal-E, a GNN approach to intrusion and anomaly detection that leverages edge features and graph topological structure in a self-supervised process.
arXiv Detail & Related papers (2022-07-14T10:59:39Z) - Edge Graph Neural Networks for Massive MIMO Detection [15.970981766599035]
Massive Multiple-Input Multiple-Output (MIMO) detection is an important problem in modern wireless communication systems.
While traditional Belief Propagation (BP) detectors perform poorly on loopy graphs, recent Graph Neural Network (GNN)-based methods can overcome the drawbacks of BP and achieve superior performance.
arXiv Detail & Related papers (2022-05-22T08:01:47Z) - BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z) - Deep Fraud Detection on Non-attributed Graph [61.636677596161235]
Graph Neural Networks (GNNs) have shown solid performance on fraud detection.
However, labeled data is scarce in large-scale industrial problems, especially for fraud detection.
We propose a novel graph pre-training strategy to leverage more unlabeled data.
arXiv Detail & Related papers (2021-10-04T03:42:09Z) - An Introduction to Robust Graph Convolutional Networks [71.68610791161355]
We propose a novel Robust Graph Convolutional Network for possibly erroneous single-view or multi-view data.
By incorporating extra layers via autoencoders into traditional graph convolutional networks, we characterize and handle typical error models explicitly.
arXiv Detail & Related papers (2021-03-27T04:47:59Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)