Event Detection on Dynamic Graphs
- URL: http://arxiv.org/abs/2110.12148v1
- Date: Sat, 23 Oct 2021 05:52:03 GMT
- Title: Event Detection on Dynamic Graphs
- Authors: Mert Kosan, Arlei Silva, Sourav Medya, Brian Uzzi, Ambuj Singh
- Abstract summary: Event detection is a critical task for timely decision-making in graph analytics applications.
We propose DyGED, a simple yet novel deep learning model for event detection on dynamic graphs.
- Score: 4.128347119808724
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event detection is a critical task for timely decision-making in graph
analytics applications. Despite the recent progress towards deep learning on
graphs, event detection on dynamic graphs presents particular challenges to
existing architectures. Real-life events are often associated with sudden
deviations of the normal behavior of the graph. However, existing approaches
for dynamic node embedding are unable to capture the graph-level dynamics
related to events.
In this paper, we propose DyGED, a simple yet novel deep learning model for
event detection on dynamic graphs. DyGED learns correlations between the graph
macro dynamics -- i.e., a sequence of graph-level representations -- and labeled
events. Moreover, our approach combines structural and temporal self-attention
mechanisms to effectively account for application-specific node and time
importance. Our experimental evaluation, using a representative set of
datasets, demonstrates that DyGED outperforms competing solutions in terms of
event detection accuracy by up to 8.5% while being more scalable than the top
alternatives. We also present case studies illustrating key features of our
model.
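The abstract describes a two-stage pipeline: pool each snapshot's node embeddings into a graph-level vector with structural self-attention, then apply temporal self-attention across the resulting sequence of "macro" vectors before scoring events. A minimal numpy sketch of that idea follows; all dimensions, weights, and helper names here are hypothetical illustrations, not DyGED's published architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(node_embeds, w):
    # structural self-attention: weight nodes by learned scores,
    # then pool them into a single graph-level vector
    scores = softmax(node_embeds @ w)            # (n_nodes,)
    return scores @ node_embeds                  # (d,)

def temporal_attention(graph_seq, wq, wk):
    # temporal self-attention over the sequence of graph-level vectors
    q, k = graph_seq @ wq, graph_seq @ wk
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)
    return attn @ graph_seq                      # (T, d)

rng = np.random.default_rng(0)
d = 8
# four snapshots with varying node counts (toy random embeddings)
snapshots = [rng.normal(size=(int(rng.integers(5, 10)), d)) for _ in range(4)]
w = rng.normal(size=d)
macro = np.stack([attention_pool(x, w) for x in snapshots])   # (4, d) macro dynamics
ctx = temporal_attention(macro, rng.normal(size=(d, d)), rng.normal(size=(d, d)))
event_logit = ctx[-1] @ rng.normal(size=d)      # event score for latest snapshot
print(macro.shape, ctx.shape)
```

In a trained model, `w`, `wq`, `wk`, and the readout vector would be learned from labeled events rather than sampled at random.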
Related papers
- Retrieval Augmented Generation for Dynamic Graph Modeling [15.09162213134372]
Dynamic graph modeling is crucial for analyzing evolving patterns in various applications.
Existing approaches often integrate graph neural networks with temporal modules or redefine dynamic graph modeling as a generative sequence task.
We introduce the Retrieval-Augmented Generation for Dynamic Graph Modeling (RAG4DyG) framework, which leverages guidance from contextually and temporally analogous examples.
arXiv Detail & Related papers (2024-08-26T09:23:35Z)
- DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper designs an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- A Graph Regularized Point Process Model For Event Propagation Sequence [2.9093633827040724]
Point processes are the dominant paradigm for modeling event sequences occurring at irregular intervals.
We propose a Graph Regularized Point Process (GRPP) that characterizes event interactions across neighboring nodes.
By applying a graph regularization method, GRPP provides model interpretability by uncovering influence strengths between nodes.
arXiv Detail & Related papers (2022-11-21T04:49:59Z)
- CEP3: Community Event Prediction with Neural Point Process on Graph [59.434777403325604]
We propose a novel model combining Graph Neural Networks and a Marked Temporal Point Process (MTPP).
Our experiments demonstrate the superior performance of our model in terms of both model accuracy and training efficiency.
arXiv Detail & Related papers (2022-05-21T15:30:25Z)
- Semi-Supervised Graph Attention Networks for Event Representation Learning [0.0]
This paper presents GNEE (GAT Neural Event Embeddings), a method that combines Graph Attention Networks and Graph Regularization.
A statistical analysis of experimental results with five real-world event graphs and six graph embedding methods shows that our GNEE outperforms state-of-the-art semi-supervised graph embedding methods.
arXiv Detail & Related papers (2022-01-02T14:38:28Z)
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose a node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we use gated recurrent units (GRUs) to capture temporal information for anomaly detection.
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
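The last entry above describes a concrete recipe: extract the $h$-hop enclosing subgraph around a target edge in each snapshot, then fold per-snapshot features into a GRU state used for anomaly scoring. A toy numpy sketch of that recipe follows; the subgraph features, weights, and readout here are hypothetical illustrations, not the paper's actual model.

```python
import numpy as np
from collections import deque

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def h_hop_subgraph(adj, edge, h):
    # BFS: collect all nodes within h hops of either endpoint of the target edge
    seen, frontier = set(edge), deque((v, 0) for v in edge)
    while frontier:
        v, depth = frontier.popleft()
        if depth == h:
            continue
        for u in np.flatnonzero(adj[v]):
            if int(u) not in seen:
                seen.add(int(u))
                frontier.append((int(u), depth + 1))
    return sorted(seen)

def gru_step(x, h_prev, Wz, Wr, Wh):
    # one GRU update folding a snapshot's subgraph features into the state
    xh = np.concatenate([x, h_prev])
    z = sigmoid(xh @ Wz)                          # update gate
    r = sigmoid(xh @ Wr)                          # reset gate
    h_tilde = np.tanh(np.concatenate([x, r * h_prev]) @ Wh)
    return (1 - z) * h_prev + z * h_tilde

rng = np.random.default_rng(1)
T, n, d_x, d_h = 3, 6, 2, 4
snapshots = [(rng.random((n, n)) < 0.4).astype(int) for _ in range(T)]
snapshots = [np.triu(a, 1) + np.triu(a, 1).T for a in snapshots]  # undirected
edge = (0, 1)                                     # target edge to score

h_state = np.zeros(d_h)
Wz, Wr, Wh = (rng.normal(size=(d_x + d_h, d_h)) for _ in range(3))
for adj in snapshots:
    sub = h_hop_subgraph(adj, edge, h=2)
    # toy features: subgraph size and its internal edge count
    x = np.array([len(sub), adj[np.ix_(sub, sub)].sum() / 2])
    h_state = gru_step(x, h_state, Wz, Wr, Wh)

anomaly_score = sigmoid(h_state @ rng.normal(size=d_h))  # readout in (0, 1)
print(round(float(anomaly_score), 3))
```

The published model learns node-role labels and all GRU parameters end to end; here random weights merely exercise the data flow.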
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.