Interpreting Temporal Graph Neural Networks with Koopman Theory
- URL: http://arxiv.org/abs/2410.13469v1
- Date: Thu, 17 Oct 2024 11:56:33 GMT
- Title: Interpreting Temporal Graph Neural Networks with Koopman Theory
- Authors: Michele Guerra, Simone Scardapane, Filippo Maria Bianchi
- Abstract summary: We introduce an explainability approach for temporal graphs.
We present two methods to interpret the STGNN's decision process.
We show how our methods can correctly identify interpretable features such as infection times and infected nodes in the context of dissemination processes.
- Score: 9.088336125738385
- Abstract: Spatiotemporal graph neural networks (STGNNs) have shown promising results in many domains, from forecasting to epidemiology. However, understanding the dynamics learned by these models and explaining their behaviour is significantly more complex than for models dealing with static data. Inspired by Koopman theory, which allows a simpler description of intricate, nonlinear dynamical systems, we introduce an explainability approach for temporal graphs. We present two methods to interpret the STGNN's decision process and identify the most relevant spatial and temporal patterns in the input for the task at hand. The first relies on dynamic mode decomposition (DMD), a Koopman-inspired dimensionality reduction method. The second relies on sparse identification of nonlinear dynamics (SINDy), a popular method for discovering governing equations, which we use for the first time as a general tool for explainability. We show how our methods can correctly identify interpretable features such as infection times and infected nodes in the context of dissemination processes.
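To make the paper's two building blocks concrete, here is a minimal NumPy sketch of exact DMD and of SINDy's sequentially thresholded least squares (STLSQ); the rank truncation, the quadratic candidate library, and the threshold values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dmd(X, r=None):
    """Exact dynamic mode decomposition.

    X: (n_features, n_timesteps) snapshot matrix. Fits the best
    linear map A with X[:, 1:] ~= A @ X[:, :-1] and returns its
    eigenvalues (temporal behaviour) and modes (spatial patterns).
    """
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:  # optional rank truncation
        U, s, Vh = U[:, :r], s[:r], Vh[:r]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

def sindy_stlsq(X, X_dot, threshold=0.1, n_iter=10):
    """SINDy via sequentially thresholded least squares.

    X: (n_samples, n_states) states, X_dot: matching derivatives.
    Builds a quadratic polynomial library Theta(X) and finds a sparse
    coefficient matrix Xi with X_dot ~= Theta @ Xi.
    """
    n, d = X.shape
    cols = [np.ones((n, 1)), X]
    for i in range(d):
        for j in range(i, d):
            cols.append((X[:, i] * X[:, j])[:, None])
    Theta = np.hstack(cols)
    Xi = np.linalg.lstsq(Theta, X_dot, rcond=None)[0]
    for _ in range(n_iter):  # alternate hard thresholding and refitting
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(d):
            big = ~small[:, k]
            if big.any():
                Xi[big, k] = np.linalg.lstsq(Theta[:, big], X_dot[:, k],
                                             rcond=None)[0]
    return Xi  # each column: sparse dynamics for one state variable
```

In the paper these tools are applied to the STGNN's learned spatiotemporal dynamics; here the snapshot matrix X and the trajectories are generic placeholders.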
Related papers
- Expressive Symbolic Regression for Interpretable Models of Discrete-Time Dynamical Systems [0.0]
We apply the Symbolic Artificial Neural Network-Trained Expressions (SymANNTEx) architecture to this task.
We show that our modified SymANNTEx model properly identifies single-state maps and achieves moderate success in approximating a dual-state attractor.
arXiv Detail & Related papers (2024-06-05T05:05:29Z)
- On The Temporal Domain of Differential Equation Inspired Graph Neural Networks [14.779420473274737]
We show that our model, called TDE-GNN, can capture a wide range of temporal dynamics that go beyond typical first or second-order methods.
We demonstrate the benefit of learning the temporal dependencies using our method rather than using pre-defined temporal dynamics on several graph benchmarks.
arXiv Detail & Related papers (2024-01-20T01:12:57Z)
- Neural Koopman prior for data assimilation [7.875955593012905]
We use a neural network architecture to embed dynamical systems in latent spaces.
We introduce methods that make it possible to train such a model for long-term continuous reconstruction (a minimal sketch of such a Koopman-style latent model follows this entry).
The potential for self-supervised learning is also demonstrated, as we show the promising use of trained dynamical models as priors for variational data assimilation techniques.
arXiv Detail & Related papers (2023-09-11T09:04:36Z)
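As a hedged illustration of embedding a dynamical system in a latent space with linear, Koopman-style evolution, here is a minimal PyTorch sketch; the encoder/decoder sizes, the multi-step rollout loss, and the class name KoopmanAE are assumptions for illustration, not the paper's exact model.

```python
import torch
import torch.nn as nn

class KoopmanAE(nn.Module):
    """Autoencoder whose latent dynamics are a single linear map K,
    in the spirit of a Koopman embedding of a nonlinear system."""
    def __init__(self, state_dim, latent_dim):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                 nn.Linear(64, latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                 nn.Linear(64, state_dim))
        self.K = nn.Linear(latent_dim, latent_dim, bias=False)

    def loss(self, x, horizon=5):
        """x: (T, state_dim) trajectory. Combines reconstruction with a
        multi-step latent rollout, encouraging long-term consistency."""
        z = self.enc(x)
        recon = ((self.dec(z) - x) ** 2).mean()
        T = x.shape[0]
        rollout, z_t = 0.0, z[:-horizon]
        for h in range(1, horizon + 1):
            z_t = self.K(z_t)  # one linear step in latent space
            rollout = rollout + ((self.dec(z_t)
                                  - x[h:T - horizon + h]) ** 2).mean()
        return recon + rollout / horizon
```

Training such a model on observed trajectories yields a linear latent dynamical model that could then serve as a prior for variational data assimilation, as the entry suggests.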
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- GDBN: a Graph Neural Network Approach to Dynamic Bayesian Network [7.876789380671075]
We propose a graph neural network approach with a score-based method for learning a sparse DAG.
We demonstrate that our graph neural network approach significantly outperforms other state-of-the-art methods at dynamic Bayesian network inference.
arXiv Detail & Related papers (2023-01-28T02:49:13Z)
- Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weight the edges of this graph by the time difference between interactions (a minimal sketch follows this entry).
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
arXiv Detail & Related papers (2022-09-30T18:24:13Z)
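As a hedged, dependency-free sketch of that construction: each timestamped interaction becomes a line-graph node, two interactions are linked when they share an endpoint, and the link is weighted by the time gap between them; the exponential decay used here is an illustrative choice, not necessarily the paper's exact weighting.

```python
import math
from itertools import combinations

def time_decayed_line_graph(interactions, decay=1.0):
    """Line graph of a temporal network.

    interactions: list of (u, v, t) timestamped edges.
    Returns line-graph node indices and a dict of weighted edges:
    interactions sharing an endpoint are linked with weight
    exp(-decay * |t_i - t_j|), so distant-in-time pairs count less.
    """
    nodes = list(range(len(interactions)))
    weighted_edges = {}
    for i, j in combinations(nodes, 2):
        ui, vi, ti = interactions[i]
        uj, vj, tj = interactions[j]
        if {ui, vi} & {uj, vj}:  # the two interactions share a node
            weighted_edges[(i, j)] = math.exp(-decay * abs(ti - tj))
    return nodes, weighted_edges

# Toy example: (a,b) and (b,c) share b and are close in time (weight ~0.37),
# while (b,c) and (c,d) share c but are 4 time units apart (weight ~0.02).
print(time_decayed_line_graph([("a", "b", 0.0), ("b", "c", 1.0), ("c", "d", 5.0)]))
```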
- Neural ODEs with Irregular and Noisy Data [8.349349605334316]
We discuss a methodology to learn differential equations from noisy and irregularly sampled measurements.
The main innovation of our methodology is the integration of deep neural networks with the neural ordinary differential equations (ODEs) approach.
The proposed framework learns a model of the vector field and is highly effective under noisy measurements.
arXiv Detail & Related papers (2022-05-19T11:24:41Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear gates (a minimal update step is sketched after this entry).
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
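For intuition about the construction described above, here is a minimal NumPy sketch of a liquid time-constant step; the fused implicit-Euler form, the tanh gate, and all parameter values are illustrative assumptions.

```python
import numpy as np

def ltc_step(x, inp, dt, tau, A, W, U, b):
    """One fused implicit-Euler step of a liquid time-constant cell.

    The hidden state x follows a linear first-order ODE whose effective
    time constant is modulated by the nonlinear gate f, so the update is
    x <- (x + dt * f * A) / (1 + dt * (1/tau + f)).
    """
    f = np.tanh(W @ x + U @ inp + b)  # nonlinear, input-dependent gate
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy rollout with random illustrative parameters
rng = np.random.default_rng(0)
d, m = 4, 2
W, U = 0.5 * rng.normal(size=(d, d)), 0.5 * rng.normal(size=(d, m))
b, A = np.zeros(d), np.ones(d)
x = np.zeros(d)
for t in range(100):
    x = ltc_step(x, np.array([np.sin(0.1 * t), 1.0]), dt=0.1,
                 tau=1.0, A=A, W=W, U=U, b=b)
print(x)  # the state remains bounded in this toy rollout
```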
This list is automatically generated from the titles and abstracts of the papers on this site.