Stochastic Graph Recurrent Neural Network
- URL: http://arxiv.org/abs/2009.00538v1
- Date: Tue, 1 Sep 2020 16:14:30 GMT
- Title: Stochastic Graph Recurrent Neural Network
- Authors: Tijin Yan, Hongwei Zhang, Zirui Li, Yuanqing Xia
- Abstract summary: We propose SGRNN, a novel neural architecture that applies stochastic latent variables to simultaneously capture evolution in node attributes and topology.
Specifically, deterministic states are separated from stochastic states in the iterative process to suppress mutual interference.
Experiments on real-world datasets demonstrate the effectiveness of the proposed model.
- Score: 6.656993023468793
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Representation learning over graph-structured data has been widely studied due
to its broad application prospects. However, previous methods mainly focus on
static graphs while many real-world graphs evolve over time. Modeling such
evolution is important for predicting properties of unseen networks. To resolve
this challenge, we propose SGRNN, a novel neural architecture that applies
stochastic latent variables to simultaneously capture the evolution in node
attributes and topology. Specifically, deterministic states are separated from
stochastic states in the iterative process to suppress mutual interference.
With semi-implicit variational inference integrated into SGRNN, a non-Gaussian
variational distribution is proposed to further improve the performance.
In addition, to alleviate the KL-vanishing problem in SGRNN, a simple and
interpretable structure is proposed based on the lower bound of the KL-divergence.
Extensive experiments on real-world datasets demonstrate the effectiveness of
the proposed model. Code is available at
https://github.com/StochasticGRNN/SGRNN.
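As a minimal sketch of the separation idea (illustrative names, not the authors' code): the deterministic hidden state is updated without the sampled latent, and the stochastic state is drawn by reparameterisation from networks conditioned on the previous hidden state. The graph aggregation producing x_t (e.g. a GCN-style A_hat @ X @ W) is assumed to happen upstream.

```python
import torch
import torch.nn as nn

class StochGraphCell(nn.Module):
    """Illustrative cell: the deterministic state h_t is updated
    separately from the stochastic state z_t, so sampling noise in z
    does not feed back into the deterministic recurrence."""
    def __init__(self, feat_dim, hid_dim, z_dim):
        super().__init__()
        self.det_cell = nn.GRUCell(feat_dim, hid_dim)             # deterministic path
        self.prior_net = nn.Linear(hid_dim, 2 * z_dim)            # p(z_t | h_{t-1})
        self.post_net = nn.Linear(hid_dim + feat_dim, 2 * z_dim)  # q(z_t | h_{t-1}, x_t)

    def forward(self, x_t, h_prev):
        h_t = self.det_cell(x_t, h_prev)                          # no z in this update
        p_mu, p_logvar = self.prior_net(h_prev).chunk(2, dim=-1)
        q_mu, q_logvar = self.post_net(torch.cat([h_prev, x_t], -1)).chunk(2, dim=-1)
        z_t = q_mu + torch.randn_like(q_mu) * torch.exp(0.5 * q_logvar)  # reparameterise
        return h_t, z_t, (p_mu, p_logvar), (q_mu, q_logvar)

# x_t would be graph-aggregated node features from an upstream GNN layer.
cell = StochGraphCell(feat_dim=16, hid_dim=32, z_dim=8)
x_t, h = torch.randn(10, 16), torch.zeros(10, 32)   # 10 nodes
h, z, prior, post = cell(x_t, h)
```

The returned prior and posterior parameters would feed the KL term of the ELBO, which is where the paper's lower-bound-based structure against KL-vanishing applies.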
Related papers
- Implicit Graph Neural Diffusion Networks: Convergence, Generalization, and Over-Smoothing [7.984586585987328]
Implicit Graph Neural Networks (GNNs) have achieved significant success in addressing graph learning problems.
We introduce a geometric framework for designing implicit graph diffusion layers based on a parameterized graph Laplacian operator.
We show how implicit GNN layers can be viewed as the fixed-point equation of a Dirichlet energy minimization problem.
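A hedged sketch of what evaluating an implicit layer's fixed-point equation can look like (illustrative equation and names; the paper's parameterised graph Laplacian operator is not reproduced here):

```python
import numpy as np

def implicit_graph_layer(A_hat, X, W, B, n_iter=100, tol=1e-6):
    """Evaluate an implicit graph layer by iterating its fixed-point
    equation  Z = tanh(A_hat @ Z @ W + X @ B)  to convergence.
    Convergence assumes the map is a contraction (e.g. small ||W||)."""
    Z = np.zeros((X.shape[0], W.shape[0]))
    for _ in range(n_iter):
        Z_new = np.tanh(A_hat @ Z @ W + X @ B)
        if np.linalg.norm(Z_new - Z) < tol:
            break
        Z = Z_new
    return Z_new

# Toy usage: 5 nodes, 3 input features, 4 hidden channels.
rng = np.random.default_rng(0)
A_hat = np.eye(5) * 0.5 + 0.1            # any normalised adjacency
W = 0.1 * rng.normal(size=(4, 4))        # scaled down to ensure contraction
B = rng.normal(size=(3, 4))
Z = implicit_graph_layer(A_hat, rng.normal(size=(5, 3)), W, B)
```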
arXiv Detail & Related papers (2023-08-07T05:22:33Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Graph Sequential Neural ODE Process for Link Prediction on Dynamic and
Sparse Graphs [33.294977897987685]
Link prediction on dynamic graphs is an important task in graph mining.
Existing approaches based on dynamic graph neural networks (DGNNs) typically require a significant amount of historical data.
We propose a novel method based on the neural process, called the Graph Sequential Neural ODE Process (GSNOP).
arXiv Detail & Related papers (2022-11-15T23:21:02Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
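A minimal sketch of that mechanism (illustrative class and parameter names, assuming dense normalised adjacencies per relation):

```python
import torch
import torch.nn as nn

class RelationEmbeddingConv(nn.Module):
    """Sketch of the one-parameter-per-relation idea: each edge type r
    gets a single learned scalar alpha[r] scaling its normalised
    adjacency, plus one scalar for the self-loop, before a shared
    linear transform."""
    def __init__(self, num_relations, in_dim, out_dim):
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(num_relations))  # relation importance
        self.self_w = nn.Parameter(torch.ones(()))            # self-loop importance
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, adjs, X):
        # adjs: list of (n x n) normalised adjacencies, one per relation
        agg = self.self_w * X
        for r, A in enumerate(adjs):
            agg = agg + self.alpha[r] * (A @ X)
        return torch.relu(self.lin(agg))

layer = RelationEmbeddingConv(num_relations=2, in_dim=8, out_dim=4)
adjs = [torch.rand(6, 6) for _ in range(2)]   # toy adjacencies, 6 nodes
out = layer(adjs, torch.randn(6, 8))
```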
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- GRAND: Graph Neural Diffusion [15.00135729657076]
We present Graph Neural Diffusion (GRAND) that approaches deep learning on graphs as a continuous diffusion process.
In our model, the layer structure and topology correspond to the discretisation choices of temporal and spatial operators.
Key to the success of our models is stability with respect to perturbations in the data, which is addressed for both implicit and explicit discretisation schemes.
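A toy sketch of the explicit (Euler) discretisation of this diffusion view, assuming a fixed attention matrix rather than the learned one:

```python
import numpy as np

def grand_euler(A_hat, X0, tau=0.2, steps=10):
    """Explicit Euler discretisation of the graph diffusion ODE
        dX/dt = (A_hat - I) X,
    giving the residual update  X <- X + tau * (A_hat @ X - X).
    A_hat is a row-stochastic attention/adjacency matrix; GRAND learns
    it, while this sketch takes it as fixed."""
    X = X0.copy()
    for _ in range(steps):
        X = X + tau * (A_hat @ X - X)
    return X

# Diffusion smooths features toward neighbourhood averages.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
print(grand_euler(A, np.array([[1.0], [0.0]])))   # values move toward 0.5
```

The step size tau and the number of steps are the temporal discretisation choices the abstract refers to; the choice of A_hat is the spatial one.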
arXiv Detail & Related papers (2021-06-21T09:10:57Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
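A worked one-step version of that view (assuming a symmetric normalised adjacency A_hat):

```python
import numpy as np

def denoise_aggregation(A_hat, X, c=0.5):
    """One gradient step (step size 1/2) on the graph denoising problem
        min_F ||F - X||^2 + c * tr(F^T L F),   L = I - A_hat (symmetric),
    starting from F = X. The step works out to
        F = (1 - c) X + c * (A_hat @ X),
    i.e. exactly a GCN-style neighbourhood aggregation."""
    L = np.eye(A_hat.shape[0]) - A_hat
    return X - c * (L @ X)

A = np.full((3, 3), 1.0 / 3.0)            # toy symmetric A_hat
print(denoise_aggregation(A, np.eye(3)))  # features pulled toward neighbours
```

Letting c vary per node instead of being a global constant yields adaptive smoothness across nodes, in the spirit of ADA-UGNN.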
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Implicit Graph Neural Networks [46.0589136729616]
We propose a graph learning framework called Implicit Graph Neural Networks (IGNN).
IGNNs consistently capture long-range dependencies and outperform state-of-the-art GNN models.
arXiv Detail & Related papers (2020-09-14T06:04:55Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
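One illustrative reading of the stochastic augmentation (an assumption about the construction, not the paper's exact scheme):

```python
import numpy as np

def add_stochastic_features(X, dim=16, seed=None):
    """Concatenate i.i.d. Gaussian features to each node's attributes.
    The random vectors act as approximate node identifiers that message
    passing can propagate, helping the GNN encode proximity; this is an
    illustrative reading of stochastic message passing, not the paper's
    exact construction."""
    rng = np.random.default_rng(seed)
    E = rng.normal(size=(X.shape[0], dim)) / np.sqrt(dim)
    return np.concatenate([X, E], axis=1)
```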
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Stochastic Graph Neural Networks [123.39024384275054]
Graph neural networks (GNNs) model nonlinear representations in graph data with applications in distributed agent coordination, control, and planning.
Current GNN architectures assume ideal scenarios and ignore link fluctuations that occur due to environment, human factors, or external attacks.
In these situations, the GNN fails to address its distributed task if the topological randomness is not considered accordingly.
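A hedged sketch of how topological randomness can be emulated during training (a DropEdge-style stand-in, not necessarily the paper's random graph model):

```python
import numpy as np

def sample_random_topology(A, keep_prob=0.9, seed=None):
    """Emulate link fluctuations: each undirected edge survives
    independently with probability keep_prob. Training a GNN over such
    samples encourages robustness to topological randomness."""
    rng = np.random.default_rng(seed)
    keep = rng.random(A.shape) < keep_prob
    keep = np.triu(keep, 1)
    keep = keep | keep.T           # symmetric mask, one draw per edge
    return A * keep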
arXiv Detail & Related papers (2020-06-04T08:00:00Z)
- Multivariate Time Series Forecasting with Transfer Entropy Graph [5.179058210068871]
We propose a novel end-to-end deep learning model, termed Graph Neural Network with Neural Granger Causality (CauGNN).
Each variable is regarded as a graph node, and each edge represents the causal relationship between variables.
Three benchmark datasets from the real world are used to evaluate the proposed CauGNN.
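A minimal sketch of the graph-construction step (using lagged cross-correlation as a cheap, runnable stand-in for the transfer entropy scores CauGNN computes):

```python
import numpy as np

def build_causal_graph(series, threshold=0.3, lag=1):
    """Build the variable graph used by CauGNN-style models: each
    variable is a node, and a directed edge i -> j exists when a
    causal score exceeds a threshold. series: shape (T, n_vars)."""
    T, n = series.shape
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            x, y = series[:-lag, i], series[lag:, j]
            score = abs(np.corrcoef(x, y)[0, 1])   # stand-in causal score
            A[i, j] = 1.0 if score > threshold else 0.0
    return A

A = build_causal_graph(np.random.default_rng(0).normal(size=(200, 4)))
```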
arXiv Detail & Related papers (2020-05-03T20:51:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.