Signed Graph Neural Ordinary Differential Equation for Modeling
Continuous-time Dynamics
- URL: http://arxiv.org/abs/2312.11198v1
- Date: Mon, 18 Dec 2023 13:45:33 GMT
- Title: Signed Graph Neural Ordinary Differential Equation for Modeling
Continuous-time Dynamics
- Authors: Lanlan Chen, Kai Wu, Jian Lou, Jing Liu
- Abstract summary: The prevailing approach of integrating graph neural networks with ordinary differential equations has demonstrated promising performance.
We introduce a novel approach: a signed graph neural ordinary differential equation, adeptly addressing the limitations of miscapturing signed information.
Our proposed solution boasts both flexibility and efficiency.
- Score: 13.912268915939656
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling continuous-time dynamics constitutes a foundational challenge, and
uncovering inter-component correlations within complex systems holds promise
for enhancing the efficacy of dynamic modeling. The prevailing approach of
integrating graph neural networks with ordinary differential equations has
demonstrated promising performance. However, they disregard the crucial signed
information intrinsic to graphs, impeding their capacity to accurately capture
real-world phenomena and leading to subpar outcomes.
In response, we introduce a novel approach: a signed graph neural ordinary
differential equation, adeptly addressing the limitations of miscapturing
signed information. Our proposed solution boasts both flexibility and
efficiency. To substantiate its effectiveness, we seamlessly integrate our
devised strategies into three preeminent graph-based dynamic modeling
frameworks: graph neural ordinary differential equations, graph neural
controlled differential equations, and graph recurrent neural networks.
Rigorous assessments encompass three intricate dynamic scenarios from physics
and biology, as well as scrutiny across four authentic real-world traffic
datasets. Remarkably outperforming the trio of baselines, empirical results
underscore the substantial performance enhancements facilitated by our proposed
approach. Our code can be found at https://github.com/beautyonce/SGODE.
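The core idea the abstract describes, message passing over an adjacency matrix whose edges carry signs, driving an ODE on node states, can be sketched as follows. This is a minimal illustration with a hand-written forward-Euler integrator and made-up weights, not the authors' implementation (see the linked repository for that); positive edges pull neighbouring states together while negative edges push them apart.

```python
import numpy as np

def signed_graph_ode_step(x, a_signed, w, dt=0.01):
    """One forward-Euler step of a toy signed graph neural ODE.

    dx/dt = tanh(A_signed @ x @ W), where A_signed is a signed
    adjacency matrix: +1 edges attract neighbouring node states,
    -1 edges repel them. An unsigned model would use |A_signed|
    and lose this distinction.
    """
    dx = np.tanh(a_signed @ x @ w)
    return x + dt * dx

# Toy system: 3 nodes, 2 features; edge (0,1) positive, (0,2) negative.
a_signed = np.array([[ 0.0, 1.0, -1.0],
                     [ 1.0, 0.0,  0.0],
                     [-1.0, 0.0,  0.0]])
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 2))  # initial node states
w = np.eye(2)                    # placeholder learnable weights

for _ in range(100):  # integrate the dynamics for 100 steps
    x = signed_graph_ode_step(x, a_signed, w)
```

In a real model `w` would be trained end-to-end through the ODE solver (e.g. with an adjoint method) rather than fixed to the identity.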
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Resilient Graph Neural Networks: A Coupled Dynamical Systems Approach [12.856220339384269]
Graph Neural Networks (GNNs) have established themselves as a key component in addressing diverse graph-based tasks.
Despite their notable successes, GNNs remain susceptible to input perturbations in the form of adversarial attacks.
This paper introduces an innovative approach to fortify GNNs against adversarial perturbations through the lens of coupled dynamical systems.
arXiv Detail & Related papers (2023-11-12T20:06:48Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z)
- Dynamic Graph Representation Learning with Neural Networks: A Survey [0.0]
Dynamic graph representations have emerged as a new machine learning problem.
This paper aims at providing a review of problems and models related to dynamic graph learning.
arXiv Detail & Related papers (2023-04-12T09:39:17Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations [21.936437653875245]
This paper focuses on representation learning for dynamic graphs with temporal interactions.
We propose a generic differential model for dynamic graphs that characterises the continuously dynamic evolution of node embedding trajectories.
Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without integration by segments.
arXiv Detail & Related papers (2023-02-22T12:59:38Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast multivariate time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.