Continuous Temporal Graph Networks for Event-Based Graph Data
- URL: http://arxiv.org/abs/2205.15924v1
- Date: Tue, 31 May 2022 16:17:02 GMT
- Title: Continuous Temporal Graph Networks for Event-Based Graph Data
- Authors: Jin Guo, Zhen Han, Zhou Su, Jiliang Li, Volker Tresp, Yuyi Wang
- Abstract summary: We propose Continuous Temporal Graph Networks (CTGNs) to capture the continuous dynamics of temporal graph data.
Key idea is to use neural ordinary differential equations (ODE) to characterize the continuous dynamics of node representations over dynamic graphs.
Experiment results on both transductive and inductive tasks demonstrate the effectiveness of our proposed approach.
- Score: 41.786721257905555
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: There has been an increasing interest in modeling continuous-time dynamics of
temporal graph data. Previous methods encode time-evolving relational
information into a low-dimensional representation by specifying discrete layers
of neural networks, while real-world dynamic graphs often vary continuously
over time. Hence, we propose Continuous Temporal Graph Networks (CTGNs) to
capture the continuous dynamics of temporal graph data. We use both the link
starting timestamps and link duration as evolving information to model the
continuous dynamics of nodes. The key idea is to use neural ordinary
differential equations (ODE) to characterize the continuous dynamics of node
representations over dynamic graphs. We parameterize ordinary differential
equations using a novel graph neural network. The existing dynamic graph
networks can be considered as a specific discretization of CTGNs. Experimental
results on both transductive and inductive tasks demonstrate the effectiveness
of our proposed approach over competitive baselines.
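The core idea, evolving node representations with a neural ODE whose dynamics function is a graph neural network, and recovering a discrete GNN layer as a one-step discretization, can be sketched in a minimal NumPy example. Everything here (the adjacency, dimensions, tanh graph-convolution dynamics, and fixed-step Euler solver) is an illustrative assumption, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only): 4 nodes, 8-dim embeddings, fixed adjacency.
n_nodes, dim = 4, 8
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(n_nodes)                   # add self-loops
A_hat /= A_hat.sum(axis=1, keepdims=True)     # row-normalize
W = rng.normal(scale=0.1, size=(dim, dim))    # shared GNN weight (assumed)

def gnn_dynamics(h, t):
    """dh/dt = f(h): one graph-convolution step with a tanh nonlinearity."""
    return np.tanh(A_hat @ h @ W)

def odeint_euler(f, h0, t0, t1, steps):
    """Fixed-step Euler solver for dh/dt = f(h, t)."""
    h, dt = h0.copy(), (t1 - t0) / steps
    for i in range(steps):
        h = h + dt * f(h, t0 + i * dt)
    return h

h0 = rng.normal(size=(n_nodes, dim))
# Continuous evolution of node states from t=0 to t=1.
h_continuous = odeint_euler(gnn_dynamics, h0, 0.0, 1.0, steps=100)
# A single Euler step of size 1 is exactly a discrete residual GNN layer,
# illustrating the claim that existing dynamic graph networks are a
# specific discretization of the continuous model.
h_discrete = h0 + gnn_dynamics(h0, 0.0)
```

In practice the ODE would be integrated between consecutive interaction events (using the link starting timestamps and durations mentioned in the abstract) with an adaptive solver rather than fixed-step Euler.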
Related papers
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations [21.936437653875245]
This paper focuses on representation learning for dynamic graphs with temporal interactions.
We propose a generic differential model for dynamic graphs that characterises the continuously dynamic evolution of node embedding trajectories.
Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without integration by segments.
arXiv Detail & Related papers (2023-02-22T12:59:38Z)
- Multi-Task Edge Prediction in Temporally-Dynamic Video Graphs [16.121140184388786]
We propose MTD-GNN, a graph network for predicting temporally-dynamic edges for multiple types of relations.
We show that modeling multiple relations in our temporal-dynamic graph network can be mutually beneficial.
arXiv Detail & Related papers (2022-12-06T10:41:00Z)
- Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with joining times of vertices and timespans of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Deep Dynamic Effective Connectivity Estimation from Multivariate Time Series [0.0]
We develop dynamic effective connectivity estimation via neural network training (DECENNT).
DECENNT outperforms state-of-the-art (SOTA) methods on five different tasks and infers interpretable task-specific dynamic graphs.
arXiv Detail & Related papers (2022-02-04T21:14:21Z)
- Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received increasing attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
arXiv Detail & Related papers (2022-01-04T23:52:24Z)
- Dynamic Graph Learning-Neural Network for Multivariate Time Series Modeling [2.3022070933226217]
We propose a novel framework, namely static- and dynamic-graph learning-neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
arXiv Detail & Related papers (2021-12-06T08:19:15Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
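The "networks of linear first-order dynamical systems with nonlinear gates" idea behind liquid time-constant networks can be sketched as follows. This is a simplified, illustrative cell under assumed shapes, weights, and gating, not the paper's exact formulation; each hidden unit follows dh/dt = -(1/tau + f)·h + f·A, where the nonlinear gate f modulates both the effective time constant and the input drive:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy LTC-style cell (illustrative sketch): 3 inputs, 5 hidden units.
n_in, n_hid = 3, 5
W_in = rng.normal(scale=0.5, size=(n_in, n_hid))   # input weights (assumed)
W_rec = rng.normal(scale=0.5, size=(n_hid, n_hid)) # recurrent weights (assumed)
tau = np.full(n_hid, 2.0)      # base time constants
A_bias = np.ones(n_hid)        # target values the gates drive h toward

def gate(x, h):
    """Nonlinear gate coupling the input and the hidden state."""
    return 1.0 / (1.0 + np.exp(-(x @ W_in + h @ W_rec)))

def ltc_step(h, x, dt=0.1):
    """One Euler step of dh/dt = -(1/tau + f) * h + f * A_bias."""
    f = gate(x, h)
    dhdt = -(1.0 / tau + f) * h + f * A_bias
    return h + dt * dhdt

h = np.zeros(n_hid)
for _ in range(200):           # feed a constant input signal
    h = ltc_step(h, np.array([1.0, -0.5, 0.25]))
# The state stays bounded between 0 and A_bias: the leak term
# -(1/tau + f) * h always pulls h back, which is the source of the
# stable, bounded behavior described in the summary.
```

The gate appearing in both the decay rate and the drive is what makes the time constant "liquid": it varies with the input instead of being fixed.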
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.