De Bruijn goes Neural: Causality-Aware Graph Neural Networks for Time
Series Data on Dynamic Graphs
- URL: http://arxiv.org/abs/2209.08311v1
- Date: Sat, 17 Sep 2022 10:54:00 GMT
- Title: De Bruijn goes Neural: Causality-Aware Graph Neural Networks for Time
Series Data on Dynamic Graphs
- Authors: Lisi Qarkaxhija, Vincenzo Perri, Ingo Scholtes
- Abstract summary: We introduce De Bruijn Graph Neural Networks (DBGNNs) for time-resolved data on dynamic graphs.
Our approach accounts for temporal-topological patterns that unfold in the causal topology of dynamic graphs.
DBGNNs can leverage temporal patterns in dynamic graphs, which substantially improves the performance in a supervised node classification task.
- Score: 1.2891210250935143
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce De Bruijn Graph Neural Networks (DBGNNs), a novel time-aware
graph neural network architecture for time-resolved data on dynamic graphs. Our
approach accounts for temporal-topological patterns that unfold in the causal
topology of dynamic graphs, which is determined by causal walks, i.e.
temporally ordered sequences of links by which nodes can influence each other
over time. Our architecture builds on multiple layers of higher-order De Bruijn
graphs, an iterative line graph construction where nodes in a De Bruijn graph
of order k represent walks of length k-1, while edges represent walks of length
k. We develop a graph neural network architecture that utilizes De Bruijn
graphs to implement a message passing scheme that follows non-Markovian
dynamics, which enables us to learn patterns in the causal topology of a
dynamic graph. Addressing the issue that De Bruijn graphs with different orders
k can be used to model the same data set, we further apply statistical model
selection to determine the optimal graph topology to be used for message
passing. An evaluation in synthetic and empirical data sets suggests that
DBGNNs can leverage temporal patterns in dynamic graphs, which substantially
improves the performance in a supervised node classification task.
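The higher-order De Bruijn graph construction described in the abstract can be sketched compactly. The following is a minimal, illustrative Python implementation (function name and toy walk data are assumptions, not the authors' code): a node of the order-k De Bruijn graph is a walk of length k-1 (a tuple of k original nodes), and each observed causal walk of length k contributes an edge.

```python
# Illustrative sketch of the order-k De Bruijn graph construction.
# Not the authors' implementation; names and data are hypothetical.
from collections import defaultdict

def de_bruijn_edges(causal_walks, k):
    """Return the weighted edges of the order-k De Bruijn graph.

    Nodes of the order-k graph are walks of length k-1 (tuples of k
    original nodes); an edge between two such nodes corresponds to an
    observed causal walk of length k (k+1 original nodes).
    """
    edges = defaultdict(int)
    for walk in causal_walks:
        # Slide a window of k+1 nodes over each causal walk.
        for i in range(len(walk) - k):
            window = tuple(walk[i:i + k + 1])
            edges[(window[:-1], window[1:])] += 1
    return dict(edges)

# Toy causal walks (illustrative, not from the paper's experiments).
walks = [["a", "b", "c"], ["a", "b", "d"], ["x", "b", "c"]]

# Order 1: nodes are single vertices, edges are observed links.
g1 = de_bruijn_edges(walks, 1)
# Order 2: nodes are links, edges are causal walks of length 2.
g2 = de_bruijn_edges(walks, 2)
```

For k = 1 this recovers an ordinary weighted, time-aggregated graph; higher orders distinguish, e.g., the continuation ("a","b") → ("b","c") from ("x","b") → ("b","c"), which is exactly the non-Markovian pattern the DBGNN message passing scheme operates on.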
Related papers
- Dynamic Dense Graph Convolutional Network for Skeleton-based Human
Motion Prediction [14.825185477750479]
This paper presents a Dynamic Dense Graph Convolutional Network (DD-GCN) which constructs a dense graph and implements an integrated dynamic message passing.
Based on the dense graph, we propose a dynamic message passing framework that learns dynamically from data to generate distinctive messages.
Experiments on benchmark Human 3.6M and CMU Mocap datasets verify the effectiveness of our DD-GCN.
arXiv Detail & Related papers (2023-11-29T07:25:49Z)
- Graph-Level Embedding for Time-Evolving Graphs [24.194795771873046]
Graph representation learning (also known as network embedding) has been extensively researched with varying levels of granularity.
We present a novel method for temporal graph-level embedding that addresses this gap.
arXiv Detail & Related papers (2023-06-01T01:50:37Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for computing the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN)
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Dynamic Graph Learning-Neural Network for Multivariate Time Series Modeling [2.3022070933226217]
We propose a novel framework, namely the static- and dynamic-graph learning-neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
arXiv Detail & Related papers (2021-12-06T08:19:15Z)
- Topological Relational Learning on Graphs [2.4692806302088868]
Graph neural networks (GNNs) have emerged as a powerful tool for graph classification and representation learning.
We propose a novel topological relational inference (TRI) which allows for integrating higher-order graph information to GNNs.
We show that the new TRI-GNN outperforms all 14 state-of-the-art baselines on 6 out of 7 graphs and exhibits higher robustness to perturbations.
arXiv Detail & Related papers (2021-10-29T04:03:27Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.