Continuous-Depth Neural Models for Dynamic Graph Prediction
- URL: http://arxiv.org/abs/2106.11581v1
- Date: Tue, 22 Jun 2021 07:30:35 GMT
- Title: Continuous-Depth Neural Models for Dynamic Graph Prediction
- Authors: Michael Poli, Stefano Massaroli, Clayton M. Rabideau, Junyoung Park,
Atsushi Yamashita, Hajime Asama, Jinkyoo Park
- Abstract summary: We introduce the framework of continuous-depth graph neural networks (GNNs).
Neural graph differential equations (Neural GDEs) are formalized as the continuous-depth counterpart to GNNs.
Results prove the effectiveness of the proposed models across applications, such as traffic forecasting or prediction in genetic regulatory networks.
- Score: 16.89981677708299
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce the framework of continuous-depth graph neural networks (GNNs).
Neural graph differential equations (Neural GDEs) are formalized as the
counterpart to GNNs where the input-output relationship is determined by a
continuum of GNN layers, blending discrete topological structures and
differential equations. The proposed framework is shown to be compatible with
static GNN models and is extended to dynamic and stochastic settings through
hybrid dynamical system theory. Here, Neural GDEs improve performance by
exploiting the underlying dynamics geometry, further introducing the ability to
accommodate irregularly sampled data. Results prove the effectiveness of the
proposed models across applications, such as traffic forecasting or prediction
in genetic regulatory networks.
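To make the framework concrete, the following is a minimal sketch of a static Neural GDE: a single graph-convolution layer serves as the vector field of an ODE over node features, so depth becomes a continuous integration interval rather than a discrete stack of layers. This is an illustrative assumption-laden sketch (PyTorch with torchdiffeq, a row-normalized dense adjacency, toy hyperparameters), not the authors' implementation; passing an irregular grid of observation times as t_span hints at how irregularly sampled data can be accommodated.

```python
# Minimal Neural GDE sketch (not the authors' code): a GNN layer as the
# right-hand side of an ODE over node features, solved with torchdiffeq.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class GCNVectorField(nn.Module):
    """dH/dt = f(t, H): one graph-convolution step used as the ODE vector field."""
    def __init__(self, a_hat: torch.Tensor, hidden_dim: int):
        super().__init__()
        self.register_buffer("a_hat", a_hat)        # normalized adjacency (N x N), assumed precomputed
        self.lin = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, t, h):
        # Propagate node states over the graph, then apply a pointwise nonlinearity.
        return torch.tanh(self.a_hat @ self.lin(h))


class NeuralGDE(nn.Module):
    """Encoder -> continuum of GNN layers (ODE flow) -> decoder."""
    def __init__(self, a_hat, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hidden_dim)
        self.field = GCNVectorField(a_hat, hidden_dim)
        self.decoder = nn.Linear(hidden_dim, out_dim)

    def forward(self, x, t_span=None):
        # Integrate node states from t=0 to t=1; an irregular grid of
        # observation times can be supplied instead of the default span.
        if t_span is None:
            t_span = torch.tensor([0.0, 1.0])
        h0 = self.encoder(x)
        h_t = odeint(self.field, h0, t_span, method="dopri5")
        return self.decoder(h_t[-1])


# Toy usage: 5 nodes, 3 input features, row-normalized adjacency with self-loops.
N, F_IN = 5, 3
A = torch.eye(N) + torch.rand(N, N).round()
A_hat = A / A.sum(1, keepdim=True)
model = NeuralGDE(A_hat, in_dim=F_IN, hidden_dim=16, out_dim=2)
y = model(torch.randn(N, F_IN))
print(y.shape)  # torch.Size([5, 2])
```

The adaptive "dopri5" solver is one plausible choice; any fixed- or adaptive-step solver (or the adjoint method for memory-efficient backpropagation) could be substituted without changing the model definition.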
Related papers
- When Graph Neural Networks Meet Dynamic Mode Decomposition [34.16727363891593]
We introduce a family of DMD-GNN models that effectively leverage the low-rank eigenfunctions provided by the DMD algorithm.
Our work paves the way for applying advanced dynamical system analysis tools via GNNs.
arXiv Detail & Related papers (2024-10-08T01:09:48Z) - Graph Neural Reaction Diffusion Models [14.164952387868341]
We propose a novel family of Reaction GNNs based on neural RD systems.
We discuss the theoretical properties of our RDGNN and its implementation, and show that it improves on or is competitive with state-of-the-art methods.
arXiv Detail & Related papers (2024-06-16T09:46:58Z) - A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z) - Resilient Graph Neural Networks: A Coupled Dynamical Systems Approach [12.856220339384269]
Graph Neural Networks (GNNs) have established themselves as a key component in addressing diverse graph-based tasks.
Despite their notable successes, GNNs remain susceptible to input perturbations in the form of adversarial attacks.
This paper introduces an innovative approach to fortify GNNs against adversarial perturbations through the lens of coupled dynamical systems.
arXiv Detail & Related papers (2023-11-12T20:06:48Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Graph Sequential Neural ODE Process for Link Prediction on Dynamic and
Sparse Graphs [33.294977897987685]
Link prediction on dynamic graphs is an important task in graph mining.
Existing approaches based on dynamic graph neural networks (DGNNs) typically require a significant amount of historical data.
We propose a novel method based on the neural process, called Graph Sequential Neural ODE Process (GSNOP)
arXiv Detail & Related papers (2022-11-15T23:21:02Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
For the first time, we learn dynamic graph representations in hyperbolic space, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.