Implicit Graph Neural Networks
- URL: http://arxiv.org/abs/2009.06211v3
- Date: Tue, 1 Jun 2021 07:21:32 GMT
- Title: Implicit Graph Neural Networks
- Authors: Fangda Gu, Heng Chang, Wenwu Zhu, Somayeh Sojoudi, Laurent El Ghaoui
- Abstract summary: We propose a graph learning framework called Implicit Graph Neural Networks (IGNN).
IGNNs consistently capture long-range dependencies and outperform state-of-the-art GNN models.
- Score: 46.0589136729616
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) are widely used deep learning models that learn
meaningful representations from graph-structured data. Due to the finite nature
of the underlying recurrent structure, current GNN methods may struggle to
capture long-range dependencies in underlying graphs. To overcome this
difficulty, we propose a graph learning framework, called Implicit Graph Neural
Networks (IGNN), where predictions are based on the solution of a fixed-point
equilibrium equation involving implicitly defined "state" vectors. We use the
Perron-Frobenius theory to derive sufficient conditions that ensure
well-posedness of the framework. Leveraging implicit differentiation, we derive
a tractable projected gradient descent method to train the framework.
Experiments on a comprehensive range of tasks show that IGNNs consistently
capture long-range dependencies and outperform the state-of-the-art GNN models.
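To make the equilibrium idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation): state vectors X satisfy X = phi(W X A + B U), solved by fixed-point iteration, with W rescaled so that a simple contraction condition holds in place of the paper's exact Perron-Frobenius condition. Training would use implicit differentiation through the equilibrium rather than backpropagating through the iterations; that part is omitted here.

```python
import numpy as np

def ignn_forward(A, U, W, B, tol=1e-6, max_iter=300):
    """Minimal sketch of an implicit GNN layer (illustrative, not the paper's code).

    Solves the equilibrium equation X = phi(W @ X @ A + B @ U) by fixed-point
    iteration, where A is the normalized adjacency matrix, U the input features
    (one column per node), and phi a componentwise nonlinearity (ReLU here).
    """
    X = np.zeros((W.shape[0], A.shape[0]))          # state vectors, one column per node
    bias = B @ U                                    # input injection, computed once
    for _ in range(max_iter):
        X_new = np.maximum(W @ X @ A + bias, 0.0)   # phi = ReLU
        converged = np.linalg.norm(X_new - X) < tol
        X = X_new
        if converged:
            break
    return X

def project_W(W, A, kappa=0.95):
    """Crude sufficient condition for well-posedness (an assumption standing in
    for the paper's Perron-Frobenius condition): shrink W so that
    ||W||_2 * ||A||_2 <= kappa < 1, which makes the iteration a contraction."""
    scale = np.linalg.norm(W, 2) * np.linalg.norm(A, 2)
    return W if scale <= kappa else W * (kappa / scale)

# Toy usage on a 4-node path graph with 3 input features and 8 state dimensions.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
A = A / A.sum(0, keepdims=True)                     # column-normalized adjacency
U = rng.normal(size=(3, 4))
W = project_W(rng.normal(size=(8, 8)) * 0.3, A)
B = rng.normal(size=(8, 3))
X = ignn_forward(A, U, W, B)
print(X.shape)                                      # (8, 4) equilibrium states
```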
Related papers
- Implicit Graph Neural Diffusion Networks: Convergence, Generalization, and Over-Smoothing [7.984586585987328]
Implicit Graph Neural Networks (GNNs) have achieved significant success in addressing graph learning problems.
We introduce a geometric framework for designing implicit graph diffusion layers based on a parameterized graph Laplacian operator.
We show how implicit GNN layers can be viewed as the fixed-point equation of a Dirichlet energy minimization problem.
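As a worked illustration of that fixed-point / Dirichlet-energy view (a generic formulation, not necessarily the paper's exact parameterized Laplacian), node embeddings Z can be read as minimizing a Dirichlet energy regularized toward an input encoding f_theta(X):

```latex
% Generic Dirichlet-energy view of an implicit diffusion layer (illustrative).
% L is a (possibly parameterized) graph Laplacian, f_\theta(X) an input encoding.
\min_{Z}\; \mathcal{E}(Z) \;=\; \tfrac{1}{2}\,\lVert Z - f_\theta(X)\rVert_F^2
          \;+\; \tfrac{\lambda}{2}\,\operatorname{tr}\!\left(Z^\top L\, Z\right)

% Setting \nabla_Z \mathcal{E}(Z) = 0 gives the equilibrium (fixed-point) condition
(I + \lambda L)\, Z^\star = f_\theta(X)
\quad\Longleftrightarrow\quad
Z^\star = Z^\star - \alpha\left[(I + \lambda L)\, Z^\star - f_\theta(X)\right]
```

An implicit layer then amounts to solving this problem to equilibrium, which is the setting in which convergence and over-smoothing can be analyzed.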
arXiv Detail & Related papers (2023-08-07T05:22:33Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
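The decomposition idea can be illustrated on a toy linear propagation step, where the contribution of a chosen node subset's features to the output separates exactly (this only illustrates decomposition-based attribution in general; DEGREE's actual decomposition rules also handle nonlinear components):

```python
import numpy as np

def linear_gcn_contributions(A_hat, X, W, target_nodes):
    """Toy additive decomposition for a single linear GCN layer Z = A_hat @ X @ W.

    Because the layer is linear in X, splitting X into 'target subgraph' and
    'background' parts decomposes the output exactly. Illustrative only; this
    is not DEGREE itself.
    """
    mask = np.zeros((X.shape[0], 1))
    mask[target_nodes] = 1.0
    X_target, X_rest = X * mask, X * (1.0 - mask)
    Z_target = A_hat @ X_target @ W        # contribution attributed to target nodes
    Z_rest = A_hat @ X_rest @ W            # contribution of everything else
    assert np.allclose(Z_target + Z_rest, A_hat @ X @ W)
    return Z_target, Z_rest
```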
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Structural Explanations for Graph Neural Networks using HSIC [21.929646888419914]
Graph neural networks (GNNs) are a type of neural model that tackles graph tasks in an end-to-end manner.
The complicated dynamics of GNNs make it difficult to understand which parts of the graph features contribute more strongly to the predictions.
In this study, a flexible model agnostic explanation method is proposed to detect significant structures in graphs.
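The dependence measure the title refers to, a biased empirical HSIC estimator, can be written in a few lines (the surrounding explanation pipeline is not reproduced here, and the suggested usage is an assumption):

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    """RBF kernel Gram matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC between paired samples (rows of X and Y).

    HSIC = trace(K H L H) / (n - 1)^2, with H the centering matrix. In an
    explanation setting, X could encode which edges/nodes are kept in a
    perturbed graph and Y the model's prediction on that perturbation
    (an assumed usage, not the paper's exact protocol)."""
    n = X.shape[0]
    K, L = rbf_gram(X, sigma), rbf_gram(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```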
arXiv Detail & Related papers (2023-02-04T09:46:47Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
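The "one parameter per relation" idea can be sketched as follows (an assumed simplification: each relation's adjacency and the self-loop connection get a single learnable scalar before an ordinary GCN-style update with shared weights):

```python
import numpy as np

def re_gnn_layer(adjs_by_relation, X, W, rel_weights, self_weight):
    """Sketch of a relation-embedding GNN layer (illustrative, not the paper's code).

    adjs_by_relation: list of normalized adjacency matrices, one per edge type.
    rel_weights:      one learnable scalar per relation (softplus keeps them >= 0).
    self_weight:      learnable scalar for the self-loop connection.
    """
    softplus = lambda a: np.log1p(np.exp(a))
    agg = softplus(self_weight) * X
    for A_r, w_r in zip(adjs_by_relation, rel_weights):
        agg = agg + softplus(w_r) * (A_r @ X)
    return np.maximum(agg @ W, 0.0)        # shared projection W, ReLU
```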
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
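A minimal sketch of unrolled, truncated proximal gradient iterations for graph deconvolution follows (a hypothetical first-order mixture A_obs ≈ h0*I + h1*A is used for brevity; the actual GDN handles richer convolutional mixtures and learns its parameters):

```python
import numpy as np

def gdn_unrolled(A_obs, h0, h1, step=0.1, thresh=0.05, K=10):
    """Unrolled proximal gradient for recovering a sparse latent graph (sketch).

    Approximately minimizes 0.5 * ||A_obs - (h0*I + h1*A)||_F^2 + thresh*||A||_1
    over symmetric A >= 0 with K truncated prox-grad steps; in a GDN, step,
    thresh, and the mixture coefficients would be learned and K fixed, giving a
    feed-forward network.
    """
    n = A_obs.shape[0]
    A = np.zeros((n, n))
    for _ in range(K):
        residual = A_obs - (h0 * np.eye(n) + h1 * A)
        A = A + step * h1 * residual                                  # gradient step
        A = np.maximum(np.abs(A) - step * thresh, 0.0) * np.sign(A)   # soft-threshold
        A = np.maximum((A + A.T) / 2.0, 0.0)                          # symmetric, nonnegative
        np.fill_diagonal(A, 0.0)                                      # no self-loops
    return A
```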
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
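The infinite-depth idea admits a closed form in the linear case (written generically here; EIGNN's exact parameterization of the propagation matrix differs): repeated propagation with a discounted normalized adjacency S converges to a Neumann series,

```latex
% Infinite-depth linear propagation (generic illustration, assuming gamma * ||S||_2 < 1).
Z \;=\; \lim_{K\to\infty} \sum_{k=0}^{K} \gamma^{k} S^{k} X W
  \;=\; (I - \gamma S)^{-1} X W
```

so an "infinite-depth" layer can be evaluated by solving a linear system rather than stacking layers.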
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- GRAND: Graph Neural Diffusion [15.00135729657076]
We present Graph Neural Diffusion (GRAND) that approaches deep learning on graphs as a continuous diffusion process.
In our model, the layer structure and topology correspond to the discretisation choices of temporal and spatial operators.
Key to the success of our models is stability with respect to perturbations in the data, which is addressed for both implicit and explicit discretisation schemes.
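To make the discretisation choices concrete, here is a toy graph heat-diffusion step in both explicit and implicit Euler form (with a fixed row-normalized adjacency standing in for GRAND's learned attention operator):

```python
import numpy as np

def euler_diffusion_step(A_row_norm, X, tau=0.2, implicit=False):
    """One step of dX/dt = (A - I) X on a graph (toy version of neural diffusion).

    Explicit Euler:  X_next = X + tau * (A - I) @ X
    Implicit Euler:  solve (I - tau * (A - I)) @ X_next = X  (more stable for large tau)
    """
    n = A_row_norm.shape[0]
    L_op = A_row_norm - np.eye(n)
    if implicit:
        return np.linalg.solve(np.eye(n) - tau * L_op, X)
    return X + tau * L_op @ X
```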
arXiv Detail & Related papers (2021-06-21T09:10:57Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs sampled from the graphon via Bernoulli edge draws.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
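The training loop described above needs graphs of growing size drawn from a graphon; a minimal sampler looks like this (the specific graphon below is an assumption chosen for illustration):

```python
import numpy as np

def sample_graph_from_graphon(graphon, n, rng):
    """Sample an n-node graph: draw latent points u_i ~ Uniform[0, 1] and edges
    A_ij ~ Bernoulli(graphon(u_i, u_j)). Growing n during training approaches
    the graphon neural network in the limit."""
    u = rng.uniform(size=n)
    P = graphon(u[:, None], u[None, :])
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T                                  # symmetric, no self-loops

# Example graphon (hypothetical): smooth two-community structure with values in [0.1, 0.9].
graphon = lambda x, y: 0.5 + 0.4 * np.cos(np.pi * x) * np.cos(np.pi * y)
rng = np.random.default_rng(0)
for n in (50, 100, 200):                            # successively larger graphs
    A = sample_graph_from_graphon(graphon, n, rng)
    # ... train / fine-tune the GNN on A here ...
```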
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.