Edge Conditional Node Update Graph Neural Network for Multi-variate Time
Series Anomaly Detection
- URL: http://arxiv.org/abs/2401.13872v1
- Date: Thu, 25 Jan 2024 00:47:44 GMT
- Title: Edge Conditional Node Update Graph Neural Network for Multi-variate Time
Series Anomaly Detection
- Authors: Hayoung Jo and Seong-Whan Lee
- Abstract summary: We introduce the Edge Conditional Node-update Graph Neural Network (ECNU-GNN).
Our model, equipped with an edge conditional node update module, dynamically transforms source node representations based on connected edges to represent target nodes aptly.
We validate performance on three real-world datasets: SWaT, WADI, and PSM.
- Score: 35.299242563565315
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the rapid advancement in cyber-physical systems, the increasing number
of sensors has significantly complicated manual monitoring of system states.
Consequently, graph-based time-series anomaly detection methods have gained
attention due to their ability to explicitly represent relationships between
sensors. However, these methods often apply a uniform source node
representation across all connected target nodes, even when updating different
target node representations. Moreover, the graph attention mechanism, commonly
used to infer unknown graph structures, could constrain the diversity of source
node representations. In this paper, we introduce the Edge Conditional
Node-update Graph Neural Network (ECNU-GNN). Our model, equipped with an edge
conditional node update module, dynamically transforms source node
representations based on connected edges to represent target nodes aptly. We
validate performance on three real-world datasets: SWaT, WADI, and PSM. Our
model achieves 5.4%, 12.4%, and 6.0% higher F1 scores on these datasets,
respectively, than the best-performing baseline models.
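The core idea above, transforming each source node's representation conditionally on the edge it travels along before updating the target node, can be sketched roughly as follows. The per-edge linear transform and the mean aggregation here are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import numpy as np

def edge_conditional_update(h, edges, edge_W):
    """Update node representations by aggregating edge-conditioned
    transforms of their source nodes.

    h      : (N, d) array of node representations
    edges  : list of (source, target) index pairs
    edge_W : dict mapping each edge to a (d, d) transform matrix
    """
    N, d = h.shape
    agg = np.zeros((N, d))
    count = np.zeros(N)
    for (s, t) in edges:
        # The same source node s is transformed differently for each
        # edge, rather than sending one uniform representation.
        agg[t] += edge_W[(s, t)] @ h[s]
        count[t] += 1
    count[count == 0] = 1.0  # avoid division by zero for isolated nodes
    return agg / count[:, None]

# Toy example: 3 nodes, 2 edges into node 2 with different transforms.
h = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
edges = [(0, 2), (1, 2)]
edge_W = {(0, 2): np.eye(2), (1, 2): 2.0 * np.eye(2)}
out = edge_conditional_update(h, edges, edge_W)
```

Node 2 receives a plain copy of node 0's representation but a scaled copy of node 1's, illustrating how one source node can contribute differently per edge.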
Related papers
- Conditional Local Feature Encoding for Graph Neural Networks [14.983942698240293]
Graph neural networks (GNNs) have shown great success in learning from graph-based data.
The key mechanism of current GNNs is message passing, where a node's feature is updated based on information passed from its local neighbourhood.
We propose conditional local feature encoding (CLFE) to help prevent the problem of node features being dominated by information from local neighbourhood.
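Plain message passing, plus the kind of self-feature preservation that CLFE motivates, might look like the sketch below. Concatenating the node's own feature with the aggregated message is one illustrative reading of the abstract, not necessarily the paper's exact design.

```python
import numpy as np

def message_passing_step(h, adj):
    """One plain message-passing step: each node's new feature is the
    mean of its neighbours' features (self excluded)."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # guard isolated nodes
    return (adj @ h) / deg

def with_self_encoding(h, adj):
    """Concatenate the node's own feature with the aggregated message,
    so the neighbourhood cannot fully dominate the update."""
    msg = message_passing_step(h, adj)
    return np.concatenate([h, msg], axis=1)

# Two connected nodes: each keeps its own feature alongside the message.
h = np.array([[1.0], [3.0]])
adj = np.array([[0.0, 1.0], [1.0, 0.0]])
z = with_self_encoding(h, adj)
```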
arXiv Detail & Related papers (2024-05-08T01:51:19Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
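Learning separate weight matrices per degree group can be sketched as below. The degree boundaries and the one-dimensional toy features are illustrative choices, not values from the paper.

```python
import numpy as np

def degree_group(deg, boundaries=(2, 5)):
    """Map a node degree to a stratum index; the boundaries here are
    an illustrative choice, not from the paper."""
    for i, b in enumerate(boundaries):
        if deg < b:
            return i
    return len(boundaries)

def stratified_transform(h, adj, W_per_group):
    """Apply a separately learned weight matrix to each node, chosen
    by the node's degree stratum."""
    deg = adj.sum(axis=1)
    out = np.zeros_like(h)
    for i in range(h.shape[0]):
        g = degree_group(deg[i])
        out[i] = W_per_group[g] @ h[i]
    return out

# Path graph 0-1-2: degrees 1, 2, 1 fall into strata 0, 1, 0.
adj = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
h = np.array([[1.0], [1.0], [1.0]])
W = {0: np.array([[1.0]]), 1: np.array([[2.0]])}
out = stratified_transform(h, adj, W)
```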
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Network Intrusion Detection with Edge-Directed Graph Multi-Head Attention Networks [13.446986347747325]
This paper proposes novel Edge-Directed Graph Multi-Head Attention Networks (EDGMAT) for network intrusion detection.
The proposed EDGMAT model introduces a multi-head attention mechanism into the intrusion detection model. Additional weight learning is realized through the combination of a multi-head attention mechanism and edge features.
arXiv Detail & Related papers (2023-10-26T12:30:11Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Feature Correlation Aggregation: on the Path to Better Graph Neural Networks [37.79964911718766]
Prior to the introduction of Graph Neural Networks (GNNs), modeling and analyzing irregular data, particularly graphs, was thought to be the Achilles' heel of deep learning.
This paper introduces a central node permutation variant function through a frustratingly simple and innocent-looking modification to the core operation of a GNN.
A tangible boost in performance of the model is observed where the model surpasses previous state-of-the-art results by a significant margin while employing fewer parameters.
arXiv Detail & Related papers (2021-09-20T05:04:26Z)
- Missing Data Estimation in Temporal Multilayer Position-aware Graph Neural Network (TMP-GNN) [5.936402320555635]
Temporal Multilayered Position-aware Graph Neural Network (TMP-GNN) is a node embedding approach for dynamic graphs.
We evaluate the performance of TMP-GNN on two different representations of temporal multilayered graphs.
We incorporate TMP-GNN into a deep learning framework to estimate missing data and compare the performance with their corresponding competent GNNs.
arXiv Detail & Related papers (2021-08-07T08:32:40Z)
- Node Similarity Preserving Graph Convolutional Networks [51.520749924844054]
Graph Neural Networks (GNNs) explore the graph structure and node features by aggregating and transforming information within node neighborhoods.
We propose SimP-GCN that can effectively and efficiently preserve node similarity while exploiting graph structure.
We validate the effectiveness of SimP-GCN on seven benchmark datasets including three assortative and four disassortative graphs.
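One way to read "preserving node similarity while exploiting graph structure" is to blend the structural adjacency with a kNN graph built from feature similarity. The convex blend below is an illustrative sketch of that idea, not SimP-GCN's actual architecture.

```python
import numpy as np

def feature_knn_graph(h, k=1):
    """Build a kNN graph from pairwise cosine similarity of features."""
    norm = np.linalg.norm(h, axis=1, keepdims=True)
    norm[norm == 0] = 1.0
    sim = (h / norm) @ (h / norm).T
    np.fill_diagonal(sim, -np.inf)  # exclude self-similarity
    A = np.zeros_like(sim)
    for i in range(h.shape[0]):
        nn = np.argsort(sim[i])[-k:]  # k most similar nodes
        A[i, nn] = 1.0
    return A

def blended_adjacency(adj, h, k=1, alpha=0.5):
    """Convex blend of the structural graph and the feature graph."""
    return alpha * adj + (1 - alpha) * feature_knn_graph(h, k)

# Nodes 0 and 1 share features but are structurally unconnected;
# the blended graph links them anyway, preserving node similarity.
h = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
adj = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
A = blended_adjacency(adj, h, k=1, alpha=0.5)
```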
arXiv Detail & Related papers (2020-11-19T04:18:01Z)
- CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize Gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
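Extracting the $h$-hop enclosing subgraph around a target edge, as the entry above describes, amounts to a depth-bounded BFS from both endpoints. A minimal sketch (node labeling and the GRU stage are omitted):

```python
from collections import deque

def h_hop_enclosing_subgraph(edges, target_edge, h):
    """Return the set of nodes within h hops of either endpoint of
    target_edge, i.e. the h-hop enclosing subgraph's node set."""
    # Build an undirected adjacency list.
    nbrs = {}
    for u, v in edges:
        nbrs.setdefault(u, set()).add(v)
        nbrs.setdefault(v, set()).add(u)
    nodes = set()
    for start in target_edge:
        dist = {start: 0}
        q = deque([start])
        while q:
            u = q.popleft()
            if dist[u] == h:
                continue  # do not expand beyond h hops
            for w in nbrs.get(u, ()):
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        nodes |= dist.keys()
    return nodes

# Path graph 0-1-2-3-4; the 1-hop subgraph around edge (1, 2)
# contains both endpoints and their immediate neighbours.
chain = [(0, 1), (1, 2), (2, 3), (3, 4)]
sub = h_hop_enclosing_subgraph(chain, (1, 2), 1)
```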
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.