Learning Coarse-Grained Dynamics on Graph
- URL: http://arxiv.org/abs/2405.09324v1
- Date: Wed, 15 May 2024 13:25:34 GMT
- Title: Learning Coarse-Grained Dynamics on Graph
- Authors: Yin Yu, John Harlim, Daning Huang, Yan Li
- Abstract summary: We consider a Graph Neural Network (GNN) non-Markovian modeling framework to identify coarse-grained dynamical systems on graphs.
Our main idea is to systematically determine the GNN architecture by inspecting how the leading term of the Mori-Zwanzig memory term depends on the coarse-grained interaction coefficients that encode the graph topology.
- Score: 4.692217705215042
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider a Graph Neural Network (GNN) non-Markovian modeling framework to identify coarse-grained dynamical systems on graphs. Our main idea is to systematically determine the GNN architecture by inspecting how the leading term of the Mori-Zwanzig memory term depends on the coarse-grained interaction coefficients that encode the graph topology. Based on this analysis, we found that the appropriate GNN architecture that will account for $K$-hop dynamical interactions has to employ a Message Passing (MP) mechanism with at least $2K$ steps. We also deduce that the memory length required for an accurate closure model decreases as a function of the interaction strength under the assumption that the interaction strength exhibits a power law that decays as a function of the hop distance. Supporting numerical demonstrations on two examples, a heterogeneous Kuramoto oscillator model and a power system, suggest that the proposed GNN architecture can predict the coarse-grained dynamics under fixed and time-varying graph topologies.
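The abstract's central architectural claim (2K message-passing steps for K-hop interactions, conditioned on a finite memory of past coarse states) can be made concrete with a minimal, hypothetical PyTorch sketch. This is not the authors' code; the class name `CoarseGrainedMPNN`, the memory length `memory`, and all layer sizes are illustrative assumptions.
```python
# Hedged sketch (not the authors' implementation): a non-Markovian GNN
# surrogate that runs 2*K message-passing steps so K-hop interactions can
# influence the node update, and encodes the last `memory` coarse-grained
# states as a truncated Mori-Zwanzig memory term.
import torch
import torch.nn as nn

class CoarseGrainedMPNN(nn.Module):  # hypothetical name
    def __init__(self, n_feats: int, hidden: int, K: int, memory: int):
        super().__init__()
        self.K = K
        self.encode = nn.Linear(n_feats * memory, hidden)  # embed the memory window
        self.msg = nn.Linear(2 * hidden, hidden)           # message from (sender, receiver)
        self.update = nn.GRUCell(hidden, hidden)           # node-state update
        self.decode = nn.Linear(hidden, n_feats)           # next-step prediction

    def forward(self, x_hist: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x_hist: (nodes, memory, n_feats) past coarse states; adj: (nodes, nodes)
        h = torch.tanh(self.encode(x_hist.flatten(1)))
        for _ in range(2 * self.K):                        # at least 2K MP steps
            src = h.unsqueeze(0).expand(h.size(0), -1, -1) # sender states h_j
            dst = h.unsqueeze(1).expand(-1, h.size(0), -1) # receiver states h_i
            m = torch.tanh(self.msg(torch.cat([src, dst], dim=-1)))
            agg = (adj.unsqueeze(-1) * m).sum(dim=1)       # weighted sum over senders
            h = self.update(agg, h)
        return self.decode(h)
```
Under the abstract's power-law assumption, where the interaction strength decays with hop distance, the window `memory` could be shortened as the decay steepens; here it is simply a fixed hyperparameter.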
Related papers
- Injecting Hamiltonian Architectural Bias into Deep Graph Networks for Long-Range Propagation [55.227976642410766]
The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning.
Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks.
We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors.
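To make the non-dissipative propagation idea concrete, here is a minimal sketch of Hamiltonian-style dynamics on a graph; this illustrates the general principle, not the paper's port-Hamiltonian formulation, and the quadratic coupling energy, `W`, and `dt` are assumptions.
```python
# Hedged sketch: conservative (non-dissipative) graph propagation. Node
# states split into positions q and momenta p; one symplectic leapfrog
# step advances the assumed Hamiltonian
#   H(q, p) = 0.5*||p||^2 + 0.25 * sum_ij A_ij ||W (q_i - q_j)||^2,
# whose q-gradient is (L q) C, with L the graph Laplacian and C = W^T W.
import numpy as np

def leapfrog_step(A, q, p, W, dt=0.1):
    """One symplectic step; A: (n, n) symmetric adjacency, q, p: (n, d)."""
    L = np.diag(A.sum(axis=1)) - A             # graph Laplacian
    C = W.T @ W                                # coupling from the assumed energy
    p_half = p - 0.5 * dt * (L @ q @ C)        # half kick: dH/dq
    q_new = q + dt * p_half                    # drift: dH/dp = p
    p_new = p_half - 0.5 * dt * (L @ q_new @ C)
    return q_new, p_new
```
Because the step is symplectic, repeated application propagates information across the graph without the energy decay that would damp long-range signals.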
arXiv Detail & Related papers (2024-05-27T13:36:50Z) - Re-Think and Re-Design Graph Neural Networks in Spaces of Continuous Graph Diffusion Functionals [7.6435511285856865]
Graph neural networks (GNNs) are widely used in domains like social networks and biological systems.
The locality assumption of GNNs hampers their ability to capture long-range dependencies and global patterns in graphs.
We propose a new inductive bias based on variational analysis, drawing inspiration from the Brachistochrone problem.
arXiv Detail & Related papers (2023-07-01T04:44:43Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - On the Ability of Graph Neural Networks to Model Interactions Between
Vertices [14.909298522361306]
Graph neural networks (GNNs) are widely used for modeling complex interactions between entities represented as vertices of a graph.
Despite recent efforts to theoretically analyze the expressive power of GNNs, a formal characterization of their ability to model interactions is lacking.
arXiv Detail & Related papers (2022-11-29T18:58:07Z) - MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We introduce and justify two weaknesses of implicit GNNs: the constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their lack of ability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
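For reference, a generic implicit-GNN layer (a sketch in the spirit of this line of work, not the MGNNI architecture): node states are defined as the fixed point of an equilibrium equation, so the effective receptive field is not tied to a fixed number of stacked layers. The contraction factor `gamma` and the `tanh` nonlinearity are assumptions.
```python
# Hedged sketch: node states Z solve Z = tanh(gamma * A @ Z @ W + X @ B),
# found here by plain Picard (fixed-point) iteration.
import numpy as np

def implicit_gnn_layer(A, X, W, B, gamma=0.9, tol=1e-6, max_iter=200):
    """A: (n, n) normalized adjacency; X: (n, d) features;
    W: (h, h) and B: (d, h) weights. Choosing gamma small enough that the
    map is a contraction guarantees the iteration converges."""
    Z = np.zeros((X.shape[0], W.shape[0]))
    bias = X @ B                               # input injection, fixed per solve
    for _ in range(max_iter):
        Z_new = np.tanh(gamma * A @ Z @ W + bias)
        if np.linalg.norm(Z_new - Z) < tol:    # reached the equilibrium state
            break
        Z = Z_new
    return Z
```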
arXiv Detail & Related papers (2022-10-15T18:18:55Z) - Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
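For concreteness, the two constructions can be sketched as follows; the Euclidean distance metric and the symmetrization of the KNN adjacency are assumptions, not details taken from the paper.
```python
# Hedged sketch of the two graph constructions compared in the paper.
import numpy as np

def knn_graph(points: np.ndarray, k: int) -> np.ndarray:
    """Symmetrized K-nearest-neighbor adjacency from (n, d) coordinates."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # exclude self-loops
    idx = np.argsort(d, axis=1)[:, :k]         # k closest neighbors per node
    A = np.zeros((len(points), len(points)))
    A[np.repeat(np.arange(len(points)), k), idx.ravel()] = 1.0
    return np.maximum(A, A.T)                  # make the graph undirected

def fc_graph(n: int) -> np.ndarray:
    """Fully-connected adjacency: every pair of distinct nodes is linked."""
    return np.ones((n, n)) - np.eye(n)
```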
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, together with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z) - Continuous-Depth Neural Models for Dynamic Graph Prediction [16.89981677708299]
We introduce the framework of continuous-depth graph neural networks (GNNs).
Neural graph differential equations (Neural GDEs) are formalized as the counterpart to GNNs.
Results demonstrate the effectiveness of the proposed models across applications such as traffic forecasting and prediction in genetic regulatory networks.
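A minimal sketch of the Neural GDE idea (not the paper's implementation): node features evolve under an ODE whose vector field is a graph layer, integrated here with explicit Euler for clarity; in practice an adaptive ODE solver (e.g., torchdiffeq) would replace the hand-rolled loop.
```python
# Hedged sketch: continuous-depth graph model dH/dt = f_theta(A, H).
import torch
import torch.nn as nn

class GDEFunc(nn.Module):
    """Vector field f_theta: one graph-convolution-style transformation."""
    def __init__(self, A: torch.Tensor, hidden: int):
        super().__init__()
        self.A = A                             # fixed (n, n) normalized adjacency
        self.lin = nn.Linear(hidden, hidden)

    def forward(self, H: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.A @ self.lin(H))

def integrate(func: GDEFunc, H0: torch.Tensor, t1: float = 1.0, steps: int = 20):
    """Explicit Euler from t=0 to t=t1; depth is now a continuous quantity."""
    H, dt = H0, t1 / steps
    for _ in range(steps):
        H = H + dt * func(H)
    return H
```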
arXiv Detail & Related papers (2021-06-22T07:30:35Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
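As an illustration of what a functional time encoding looks like (a generic sinusoidal scheme; the frequencies and dimension are assumptions, not the encoding used in HVGNN):
```python
# Hedged sketch: map a timestamp to a feature vector with fixed sinusoids
# at geometrically spaced frequencies, so relative times are recoverable
# from inner products of the encodings.
import numpy as np

def time_encoding(t: float, dim: int = 16) -> np.ndarray:
    freqs = 1.0 / np.power(10.0, np.linspace(0.0, 4.0, dim // 2))
    angles = t * freqs
    return np.concatenate([np.cos(angles), np.sin(angles)])
```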
arXiv Detail & Related papers (2021-04-06T01:44:15Z)