On The Temporal Domain of Differential Equation Inspired Graph Neural
Networks
- URL: http://arxiv.org/abs/2401.11074v1
- Date: Sat, 20 Jan 2024 01:12:57 GMT
- Title: On The Temporal Domain of Differential Equation Inspired Graph Neural
Networks
- Authors: Moshe Eliasof, Eldad Haber, Eran Treister, Carola-Bibiane Schönlieb
- Abstract summary: We show that our model, called TDE-GNN, can capture a wide range of temporal dynamics that go beyond typical first or second-order methods.
We demonstrate the benefit of learning the temporal dependencies using our method rather than using pre-defined temporal dynamics on several graph benchmarks.
- Score: 14.779420473274737
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have demonstrated remarkable success in modeling
complex relationships in graph-structured data. A recent innovation in this
field is the family of Differential Equation-Inspired Graph Neural Networks
(DE-GNNs), which leverage principles from continuous dynamical systems to model
information flow on graphs with built-in properties such as feature smoothing
or preservation. However, existing DE-GNNs rely on first or second-order
temporal dependencies. In this paper, we propose a neural extension to those
pre-defined temporal dependencies. We show that our model, called TDE-GNN, can
capture a wide range of temporal dynamics that go beyond typical first or
second-order methods, and provide use cases where existing temporal models are
challenged. We demonstrate the benefit of learning the temporal dependencies
using our method rather than using pre-defined temporal dynamics on several
graph benchmarks.
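The core idea admits a compact sketch. A first-order DE-GNN advances node features with a single-step update x^{t+1} = x^t + h·f(x^t), while TDE-GNN replaces the fixed scheme with a learned combination of the last k states, akin to a learned linear multistep method. The sketch below is a minimal illustration with scalar node features, plain diffusion dynamics, and hand-picked coefficients; all names and choices here are assumptions for exposition, not the paper's implementation.

```python
# Hedged sketch of the TDE-GNN idea: instead of a fixed first-order update
#   x^{t+1} = x^t + h * f(x^t)                (diffusion-type DE-GNN)
# the next state is a *learned* combination of the last k states,
#   x^{t+1} = sum_i a_i * x^{t-i} + h * f(x^t),
# resembling a learned linear multistep scheme. Scalar node features and
# fixed coefficients are used purely for illustration.

def graph_diffusion(x, adj, h=0.1):
    """One explicit diffusion step on scalar node features.

    x:   list of node features
    adj: adjacency as {node: [neighbors]}
    """
    return [
        xi + h * sum(x[j] - xi for j in adj[i])
        for i, xi in enumerate(x)
    ]

def tde_step(history, adj, coeffs, h=0.1):
    """Higher-order step: mix the last k states (coeffs would be trained
    in practice; here they are fixed for illustration)."""
    mix = [
        sum(a * state[i] for a, state in zip(coeffs, reversed(history)))
        for i in range(len(history[-1]))
    ]
    step = graph_diffusion(history[-1], adj, h)
    # keep the diffusion increment, replace the identity part by the mix
    return [m + (s - xi) for m, s, xi in zip(mix, step, history[-1])]

# tiny path graph 0-1-2 with all feature mass on the middle node
adj = {0: [1], 1: [0, 2], 2: [1]}
history = [[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]]   # two past states (k = 2)
x_next = tde_step(history, adj, coeffs=[0.9, 0.1])
```

With equal past states the learned mix reduces to the current state, so the step behaves like plain diffusion; training the coefficients is what lets the model express dynamics beyond first or second order.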
Related papers
- DTFormer: A Transformer-Based Method for Discrete-Time Dynamic Graph Representation Learning [38.53424185696828]
The representation learning of Discrete-Time Dynamic Graphs (DTDGs) has been extensively applied to model the dynamics of temporally changing entities and their evolving connections.
This paper introduces a novel representation learning method, DTFormer, for DTDGs, pivoting from the traditional GNN+RNN framework to a Transformer-based architecture.
arXiv Detail & Related papers (2024-07-26T05:46:23Z)
- Mending of Spatio-Temporal Dependencies in Block Adjacency Matrix [3.529869282529924]
We propose a novel end-to-end learning architecture designed to mend the temporal dependencies, resulting in a well-connected graph.
Our methodology demonstrates superior performance on benchmark datasets, such as SurgVisDom and C2D2.
arXiv Detail & Related papers (2023-10-04T06:42:33Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- An Explainer for Temporal Graph Neural Networks [27.393641343203363]
Temporal graph neural networks (TGNNs) have been widely used for modeling time-evolving graph-related tasks.
We propose a novel explainer framework for TGNN models.
arXiv Detail & Related papers (2022-09-02T04:12:40Z)
- Continuous Temporal Graph Networks for Event-Based Graph Data [41.786721257905555]
We propose Continuous Temporal Graph Networks (CTGNs) to capture the continuous dynamics of temporal graph data.
The key idea is to use neural ordinary differential equations (ODEs) to characterize the continuous dynamics of node representations over dynamic graphs.
Experimental results on both transductive and inductive tasks demonstrate the effectiveness of our proposed approach.
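For intuition, the continuous-time view can be sketched by integrating node states under an ODE between observation times. Below, plain graph diffusion stands in for the learned dynamics f_theta and forward Euler stands in for the ODE solver; both substitutions, and all names, are assumptions for illustration rather than the CTGN implementation.

```python
# Illustration only: treat node representations as a continuous-time system
#   dx/dt = f(x, graph)
# and integrate it over an interval. A CTGN-style model would use a learned
# f_theta and a proper ODE solver; here f is plain diffusion and the solver
# is forward Euler.

def euler_integrate(x0, adj, t0, t1, steps=10):
    """Integrate dx_i/dt = sum_j (x_j - x_i) from t0 to t1 with forward Euler.

    x0:  list of scalar node features at time t0
    adj: adjacency as {node: [neighbors]}
    """
    h = (t1 - t0) / steps
    x = list(x0)
    for _ in range(steps):
        x = [xi + h * sum(x[j] - xi for j in adj[i]) for i, xi in enumerate(x)]
    return x

# path graph 0-1-2; mass placed on the middle node spreads outward over time
adj = {0: [1], 1: [0, 2], 2: [1]}
x_t1 = euler_integrate([0.0, 1.0, 0.0], adj, t0=0.0, t1=1.0)
```

Because the dynamics are defined in continuous time, the same model can be queried at arbitrary event timestamps rather than at fixed discrete snapshots.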
arXiv Detail & Related papers (2022-05-31T16:17:02Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by the limited ability of finite-depth GNNs to capture long-range dependencies, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Temporal Knowledge Graph Forecasting with Neural ODE [19.64877769280854]
We extend the idea of continuum-depth models to time-evolving multi-relational graph data.
Our model captures temporal information through a neural ODE (NODE) and structural information through a Graph Neural Network (GNN).
Our model is continuous in time and efficiently learns node representations for future prediction.
arXiv Detail & Related papers (2021-01-13T15:49:48Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior, and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.