Autoregressive GNN-ODE GRU Model for Network Dynamics
- URL: http://arxiv.org/abs/2211.10594v1
- Date: Sat, 19 Nov 2022 05:43:10 GMT
- Title: Autoregressive GNN-ODE GRU Model for Network Dynamics
- Authors: Bo Liang, Lin Wang, Xiaofan Wang
- Abstract summary: We propose an Autoregressive GNN-ODE GRU Model (AGOG) to learn and capture the continuous network dynamics.
Our model can capture the continuous dynamic process of complex systems accurately and make predictions of node states with minimal error.
- Score: 7.272158647379444
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Revealing the continuous dynamics on networks is essential for
understanding, predicting, and even controlling complex systems, but these
dynamics are hard to learn and model because of complex and unknown governing
equations, the high dimensionality of complex systems, and unsatisfactory
observations. Moreover, in real cases, observed time-series data are usually
non-uniform and sparse, which poses further serious challenges. In
this paper, we propose an Autoregressive GNN-ODE GRU Model (AGOG) to learn and
capture the continuous network dynamics and realize predictions of node states
at an arbitrary time in a data-driven manner. The GNN module models the
complicated, nonlinear network dynamics. The hidden state of each node is
governed by an ODE system, and an augmented ODE system maps the GNN into the
continuous time domain. The hidden state is updated by a GRUCell whenever an
observation arrives. As prior knowledge, the true observations at the same
timestamp are combined with the hidden states for the next prediction. We use
an autoregressive model to make one-step-ahead predictions based on the
observation history; each prediction is obtained by solving an initial-value
problem for the ODE. To verify the performance of our model, we visualize the
learned dynamics and test them in three tasks: interpolation reconstruction,
extrapolation prediction, and regular sequences prediction. The results
demonstrate that our model can capture the continuous dynamic process of
complex systems accurately and make precise predictions of node states with
minimal error. Our model can consistently outperform other baselines or achieve
comparable performance.
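The pipeline described in the abstract (continuous ODE evolution of hidden states between observations, GRUCell corrections at observation times, prediction by solving an initial-value problem) can be sketched roughly as follows. All weights, sizes, and function names are illustrative assumptions, not the authors' implementation, and a naive explicit-Euler solver stands in for a proper ODE solver.

```python
import numpy as np

# Hypothetical minimal sketch of an autoregressive GNN-ODE-GRU prediction loop:
# a hidden state per node evolves under a graph-coupled ODE between observations
# and is corrected by a GRU-style update whenever an observation arrives.
rng = np.random.default_rng(0)
N, D = 4, 3                               # nodes, hidden dimension
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], float)
A = A / A.sum(1, keepdims=True)           # row-normalized adjacency
W_self = rng.normal(scale=0.1, size=(D, D))
W_nbr = rng.normal(scale=0.1, size=(D, D))
Wz, Wr, Wh = (rng.normal(scale=0.1, size=(2 * D, D)) for _ in range(3))

def ode_rhs(H):
    """GNN-style dynamics: dH/dt = tanh(H W_self + A H W_nbr)."""
    return np.tanh(H @ W_self + A @ H @ W_nbr)

def solve_ivp(H, t0, t1, steps=20):
    """Solve the initial-value problem from t0 to t1 with explicit Euler."""
    dt = (t1 - t0) / steps
    for _ in range(steps):
        H = H + dt * ode_rhs(H)
    return H

def gru_update(H, X):
    """GRUCell-style correction of the hidden state H by an observation X."""
    z = 1 / (1 + np.exp(-np.concatenate([H, X], 1) @ Wz))  # update gate
    r = 1 / (1 + np.exp(-np.concatenate([H, X], 1) @ Wr))  # reset gate
    cand = np.tanh(np.concatenate([r * H, X], 1) @ Wh)     # candidate state
    return (1 - z) * H + z * cand

# Autoregressive loop over non-uniform observation times.
times = [0.0, 0.3, 1.1, 1.5]
H = np.zeros((N, D))
for t_prev, t_next in zip(times[:-1], times[1:]):
    H = solve_ivp(H, t_prev, t_next)      # evolve hidden state to t_next
    X_obs = rng.normal(size=(N, D))       # observed node states at t_next
    H = gru_update(H, X_obs)              # fold the observation into H
prediction = H @ rng.normal(scale=0.1, size=(D, 1))  # linear readout per node
print(prediction.shape)                   # (4, 1)
```

The key property this sketch illustrates is that `solve_ivp` can be stopped at any intermediate time, which is what enables predictions at arbitrary timestamps from irregular observations.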
Related papers
- DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z)
- Higher-order Spatio-temporal Physics-incorporated Graph Neural Network for Multivariate Time Series Imputation [9.450743095412896]
Missing values are an essential but challenging issue due to the complex latent spatio-temporal correlation and dynamic nature of time series.
We propose a Higher-order Spatio-temporal Physics-incorporated Graph Neural Network (HSPGNN) to address this problem.
HSPGNN provides better dynamic analysis and explanation than traditional data-driven models.
arXiv Detail & Related papers (2024-05-16T16:35:43Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Brain-Inspired Spiking Neural Network for Online Unsupervised Time Series Prediction [13.521272923545409]
We present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN).
CLURSNN makes online predictions by reconstructing the underlying dynamical system using Random Delay Embedding.
We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
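The delay-embedding reconstruction mentioned above builds on classical time-delay (Takens-style) embedding. A minimal sketch, with an illustrative embedding dimension and lag rather than CLURSNN's actual randomized choices:

```python
import numpy as np

# Minimal sketch of time-delay embedding: reconstruct a state-space trajectory
# from a single observed scalar signal. dim and tau below are illustrative.
def delay_embed(x, dim=3, tau=5):
    """Map a scalar series x to delay vectors [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

x = np.sin(0.1 * np.arange(100))          # a simple observed scalar signal
E = delay_embed(x)                        # reconstructed trajectory, one row per time step
print(E.shape)                            # (90, 3)
```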
arXiv Detail & Related papers (2023-04-10T16:18:37Z)
- Temporal Graph Neural Networks for Irregular Data [14.653008985229615]
The TGNN4I model is designed to handle both irregular time steps and partial observations of the graph.
Time-continuous dynamics enables the model to make predictions at arbitrary time steps.
Experiments on simulated data and real-world data from traffic and climate modeling validate the usefulness of both the graph structure and time-continuous dynamics.
arXiv Detail & Related papers (2023-02-16T16:47:55Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
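The two steps above — inferring an unknown graph from node-level information, then running message passing in continuous time as a neural ODE — can be sketched roughly as follows. The embedding-based adjacency parameterization, the Euler solver, and all names and sizes are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Rough sketch: (1) infer a soft adjacency from learnable node embeddings to
# stand in for an unknown graph topology, (2) propagate node features with
# ODE-based continuous message passing (explicit Euler as the solver).
rng = np.random.default_rng(3)
N, D = 5, 4                               # nodes, feature dimension
E = rng.normal(size=(N, 8))               # learnable node embeddings
logits = E @ E.T
Adj = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)  # inferred soft graph
W = rng.normal(scale=0.2, size=(D, D))

def rhs(H):
    """Continuous message passing: dH/dt = tanh(Adj H W) - H (leak toward 0)."""
    return np.tanh(Adj @ H @ W) - H

def odeint_euler(H, T=1.0, steps=50):
    dt = T / steps
    for _ in range(steps):
        H = H + dt * rhs(H)
    return H

H0 = rng.normal(size=(N, D))              # node features at t = 0
HT = odeint_euler(H0)                     # features after continuous propagation
print(HT.shape)                           # (5, 4)
```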
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
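Modeling observed dynamics as a forced linear system amounts to fitting x_{k+1} ≈ A x_k + B u_k from snapshot data. A generic least-squares sketch of that fit (DMD-with-control style; the paper's stochastically forced ensemble method adds further machinery, and A_true/B_true and all sizes here are illustrative):

```python
import numpy as np

# Fit a forced linear system x_{k+1} = A x_k + B u_k by least squares
# from state snapshots X and forcing inputs U (noise-free synthetic data).
rng = np.random.default_rng(1)
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
B_true = np.array([[0.5],
                   [1.0]])
U = rng.normal(size=(50, 1))              # observed forcing inputs
X = [rng.normal(size=2)]                  # observed state snapshots
for k in range(50):
    X.append(A_true @ X[k] + B_true @ U[k])
X = np.array(X)

Z = np.hstack([X[:-1], U])                # regressors [x_k, u_k]
AB, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
A_fit, B_fit = AB[:2].T, AB[2:].T         # recovered system matrices
print(np.allclose(A_fit, A_true))         # True (noise-free data)
```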
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Streaming Graph Neural Networks via Continual Learning [31.810308087441445]
Graph neural networks (GNNs) have achieved strong performance in various applications.
In this paper, we propose a streaming GNN model based on continual learning.
We show that our model can efficiently update model parameters and achieve comparable performance to model retraining.
arXiv Detail & Related papers (2020-09-23T06:52:30Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
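The construction described for liquid time-constant networks — a linear first-order dynamical system whose effective time constant is modulated by a nonlinear gate — can be sketched as a single Euler-integrated unit. The gate form, weights, and sizes are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

# Illustrative sketch of a liquid time-constant (LTC) unit:
# dx/dt = -(1/tau + f(x, u)) * x + f(x, u) * amp, where the gate f shortens
# the time constant when active, solved with explicit Euler.
rng = np.random.default_rng(2)
nh, ni = 4, 2                             # hidden units, input channels
W_in = rng.normal(scale=0.5, size=(ni, nh))
W_rec = rng.normal(scale=0.5, size=(nh, nh))
tau, amp = 1.0, 1.0                       # base time constant, bias amplitude

def ltc_step(x, u, dt=0.05):
    """One Euler step of the liquid time-constant dynamics."""
    f = 1 / (1 + np.exp(-(u @ W_in + x @ W_rec)))  # nonlinear gate in (0, 1)
    return x + dt * (-(1.0 / tau + f) * x + f * amp)

x = np.zeros(nh)
for t in range(200):
    x = ltc_step(x, np.array([np.sin(0.1 * t), 1.0]))
print(np.all((x >= 0) & (x < amp)))       # True: state stays bounded
```

The bounded-state check at the end mirrors the stability property claimed above: with this dynamics, the state cannot escape the interval set by the bias amplitude.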
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.