Mutually exciting point process graphs for modelling dynamic networks
- URL: http://arxiv.org/abs/2102.06527v1
- Date: Thu, 11 Feb 2021 10:14:55 GMT
- Title: Mutually exciting point process graphs for modelling dynamic networks
- Authors: Francesco Sanna Passino, Nicholas A. Heard
- Abstract summary: A new class of models for dynamic networks is proposed, called mutually exciting point process graphs (MEG).
MEG is a scalable network-wide statistical model for point processes with dyadic marks, which can be used for anomaly detection.
The model is tested on simulated graphs and real world computer network datasets, demonstrating excellent performance.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A new class of models for dynamic networks is proposed, called mutually
exciting point process graphs (MEG), motivated by a practical application in
computer network security. MEG is a scalable network-wide statistical model for
point processes with dyadic marks, which can be used for anomaly detection when
assessing the significance of previously unobserved connections. The model
combines mutually exciting point processes to estimate dependencies between
events and latent space models to infer relationships between the nodes. The
intensity functions for each network edge are parameterised exclusively by
node-specific parameters, which allows information to be shared across the
network. Fast inferential procedures using modern gradient ascent algorithms
are exploited. The model is tested on simulated graphs and real world computer
network datasets, demonstrating excellent performance.
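To make the idea of a node-parameterised, mutually exciting edge intensity concrete, here is a minimal sketch in the spirit of the abstract, assuming an additive node-level baseline and an exponential excitation kernel; the function and parameter names are illustrative and this is not the paper's actual MEG parameterisation (which also involves latent space components).

```python
import numpy as np

def edge_intensity(t, past_events_ij, alpha_i, beta_j, mu_i, nu_j, theta):
    """Illustrative mutually exciting intensity for edge (i, j) at time t.

    The baseline alpha_i + beta_j uses only node-specific parameters, so
    information is shared across every edge touching node i or node j.
    Each past event on the edge adds an exponentially decaying excitation
    with node-specific magnitude mu_i * nu_j and decay rate theta.
    """
    baseline = alpha_i + beta_j
    excitation = sum(
        mu_i * nu_j * np.exp(-theta * (t - s))
        for s in past_events_ij
        if s < t
    )
    return baseline + excitation

# Intensity of edge (i, j) at t = 10 after three earlier connection events.
print(edge_intensity(10.0, [2.0, 7.5, 9.0],
                     alpha_i=0.1, beta_j=0.05, mu_i=0.8, nu_j=0.6, theta=1.2))
```

Because only node-level parameters appear in the intensity, a previously unobserved edge still has a well-defined baseline rate, which is what makes it possible to score new connections for anomaly detection.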
Related papers
- Learning the mechanisms of network growth [42.1340910148224]
We propose a novel model-selection method for dynamic networks.
Data is generated by simulating nine state-of-the-art random graph models.
Proposed features are easy to compute, analytically tractable, and interpretable.
arXiv Detail & Related papers (2024-03-31T20:38:59Z)
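As a rough illustration of feature-based model selection for networks (the specific features and the nearest-centroid rule below are assumptions, not the paper's method), one could summarise an observed graph with a few interpretable statistics and compare them against graphs simulated from the candidate growth models:

```python
import numpy as np

def graph_features(adj):
    """A few simple, interpretable summaries of an undirected graph (illustrative)."""
    n = adj.shape[0]
    degrees = adj.sum(axis=1)
    density = adj.sum() / (n * (n - 1))
    triangles = np.trace(adj @ adj @ adj) / 6        # closed triangles
    triples = (degrees * (degrees - 1)).sum() / 2    # connected triples
    clustering = 3 * triangles / triples if triples > 0 else 0.0
    return np.array([density, degrees.std(), clustering])

def select_model(observed_adj, simulated):
    """Pick the candidate model whose simulated graphs have the closest mean features.

    `simulated` maps a model name to a list of adjacency matrices sampled from it.
    """
    obs = graph_features(observed_adj)
    centroids = {name: np.mean([graph_features(a) for a in graphs], axis=0)
                 for name, graphs in simulated.items()}
    return min(centroids, key=lambda name: np.linalg.norm(obs - centroids[name]))
```

The sketch only conveys the overall shape of such a pipeline, not the paper's actual features or decision rule.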
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
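Histogram intersection is a standard similarity between histograms (the sum of element-wise minima). The sketch below compares a node's feature histogram with the pooled histogram of its neighbours purely to illustrate the idea; it is not the GNN-LoFI layer itself, and the binning and pooling choices are assumptions:

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Histogram intersection kernel: sum of element-wise minima."""
    return np.minimum(h1, h2).sum()

def local_feature_similarity(features, neighbours, node, bins=10, value_range=(0.0, 1.0)):
    """Compare a node's feature histogram with the pooled histogram of its neighbours.

    `features` is an (n_nodes, n_dims) array; histograms are built by binning
    all feature values of a node (an illustrative choice, not GNN-LoFI's).
    """
    def hist(x):
        h, _ = np.histogram(x, bins=bins, range=value_range, density=True)
        return h

    own = hist(features[node])
    neigh = hist(features[list(neighbours[node])])
    return histogram_intersection(own, neigh)
```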
- Network Intrusion Detection with Edge-Directed Graph Multi-Head Attention Networks [13.446986347747325]
This paper proposes novel Edge-Directed Graph Multi-Head Attention Networks (EDGMAT) for network intrusion detection.
The proposed EDGMAT model introduces a multi-head attention mechanism into the intrusion detection model, learning additional weights by combining multi-head attention with edge features.
arXiv Detail & Related papers (2023-10-26T12:30:11Z)
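The exact EDGMAT architecture is not described here, but the following generic sketch shows one way attention logits can incorporate per-edge features alongside node features; the projection shapes and the combination rule are assumptions:

```python
import numpy as np

def edge_attention_scores(h, edge_feat, edges, Wq, Wk, We):
    """Attention logits for directed edges, incorporating edge features.

    h:         (n_nodes, d) node features
    edge_feat: (n_edges, d_e) per-edge features (e.g. flow statistics)
    edges:     list of (src, dst) index pairs
    The score for edge (i, j) is q_i . (k_j + e_ij), one illustrative way of
    mixing node and edge information, not EDGMAT's exact rule.
    """
    scores = []
    for idx, (i, j) in enumerate(edges):
        q = Wq @ h[i]
        k = Wk @ h[j]
        e = We @ edge_feat[idx]
        scores.append(q @ (k + e) / np.sqrt(len(q)))
    return np.array(scores)
```

A multi-head variant would repeat this with separate projection matrices per head and concatenate or average the per-head outputs.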
- A parameterised model for link prediction using node centrality and similarity measure based on graph embedding [5.507008181141738]
Link prediction is a key aspect of graph machine learning.
It involves predicting new links that may form between network nodes.
Existing models have significant shortcomings.
We present the Node Centrality and Similarity Based Model (NCSM), a novel method for link prediction tasks.
arXiv Detail & Related papers (2023-09-11T13:13:54Z)
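As a hedged sketch of the general idea of mixing node centrality with an embedding-based similarity (the choice of degree centrality, cosine similarity and the trade-off weight below are assumptions, not NCSM's exact formulation):

```python
import numpy as np

def link_score(emb, degree, i, j, gamma=0.5):
    """Score a candidate link (i, j) by mixing node centrality and embedding similarity.

    emb:    (n_nodes, d) node embeddings from any graph embedding method
    degree: (n_nodes,) degree centrality values
    gamma:  trade-off between the centrality and similarity terms (assumed)
    """
    cos = emb[i] @ emb[j] / (np.linalg.norm(emb[i]) * np.linalg.norm(emb[j]) + 1e-12)
    centrality = np.sqrt(degree[i] * degree[j]) / (degree.max() + 1e-12)
    return gamma * centrality + (1.0 - gamma) * cos
```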
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Optimal Connectivity through Network Gradients for the Restricted Boltzmann Machine [0.0]
A fundamental problem is efficiently finding connectivity patterns that improve the learning curve.
Recent approaches explicitly include network connections as parameters that must be optimized in the model.
This work presents a method to find optimal connectivity patterns for RBMs based on the idea of network gradients.
arXiv Detail & Related papers (2022-09-14T21:09:58Z)
- Graph similarity learning for change-point detection in dynamic networks [15.694880385913534]
We consider dynamic networks that are temporal sequences of graph snapshots.
This task is often termed network change-point detection and has numerous applications, such as fraud detection or physical motion monitoring.
We design a method to perform online network change-point detection that can adapt to the specific network domain and localise changes with no delay.
arXiv Detail & Related papers (2022-03-29T12:16:38Z)
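A bare-bones sketch of the underlying pattern, flagging a change point when the similarity between consecutive snapshot summaries drops below a threshold; the spectral signature, the fixed similarity (the paper learns one) and the threshold are all illustrative assumptions:

```python
import numpy as np

def snapshot_signature(adj, k=5):
    """Summarise an undirected graph snapshot by its top-k adjacency eigenvalues (illustrative)."""
    eigvals = np.linalg.eigvalsh(adj)   # ascending order for symmetric matrices
    return eigvals[-k:]

def online_changepoints(snapshots, threshold=0.9):
    """Flag snapshot indices whose signature similarity to the previous one drops."""
    flagged = []
    prev = snapshot_signature(snapshots[0])
    for t, adj in enumerate(snapshots[1:], start=1):
        cur = snapshot_signature(adj)
        sim = prev @ cur / (np.linalg.norm(prev) * np.linalg.norm(cur) + 1e-12)
        if sim < threshold:
            flagged.append(t)
        prev = cur
    return flagged
```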
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Mitigating Performance Saturation in Neural Marked Point Processes: Architectures and Loss Functions [50.674773358075015]
We propose a simple graph-based network structure called GCHP, which utilizes only graph convolutional layers.
We show that GCHP can significantly reduce training time, and that a likelihood ratio loss with inter-arrival time probability assumptions can greatly improve model performance.
arXiv Detail & Related papers (2021-07-07T16:59:14Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of using the same path through the network for every input, DG-Net aggregates features dynamically at each node, which gives the network greater representational ability.
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
- Pre-Trained Models for Heterogeneous Information Networks [57.78194356302626]
We propose a self-supervised pre-training and fine-tuning framework, PF-HIN, to capture the features of a heterogeneous information network.
PF-HIN consistently and significantly outperforms state-of-the-art alternatives on each of these tasks, on four datasets.
arXiv Detail & Related papers (2020-07-07T03:36:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.