Unifying Label-inputted Graph Neural Networks with Deep Equilibrium
Models
- URL: http://arxiv.org/abs/2211.10629v2
- Date: Wed, 31 May 2023 02:04:23 GMT
- Title: Unifying Label-inputted Graph Neural Networks with Deep Equilibrium
Models
- Authors: Yi Luo, Guiduo Duan, Guangchun Luo, Aiguo Chen
- Abstract summary: This work unifies two GNN subdomains, Label-inputted GNN (LGNN) and Implicit GNN (IGNN), by interpreting LGNN in the theory of IGNN.
IGNN exploits information in the entire graph to capture long-range dependencies, but with its network constrained to guarantee the existence of the equilibrium.
In this work, the implicit differentiation of IGNN is introduced to LGNN to differentiate its infinite-range label propagation with constant memory, making the propagation both distant and adaptive.
- Score: 12.71307159013144
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The success of Graph Neural Networks (GNNs) in learning on
non-Euclidean data has given rise to many subtopics, such as Label-inputted GNN
(LGNN) and Implicit GNN (IGNN). LGNN, which explicitly feeds supervising
information (i.e., labels) into a GNN, integrates label propagation to achieve
superior performance, but it faces a dilemma between propagation distance and
adaptiveness. IGNN, which outputs an equilibrium point by iterating its network
infinitely many times, exploits information from the entire graph to capture
long-range dependencies, but its network must be constrained to guarantee that
the equilibrium exists. This work unifies the two subdomains by interpreting
LGNN in the theory of IGNN and reducing prevailing LGNNs to the form of IGNN.
The unification facilitates the exchange between the two subdomains and
inspires further studies. Specifically, the implicit differentiation of IGNN is
introduced to LGNN to differentiate its infinite-range label propagation with
constant memory, making the propagation both distant and adaptive. Moreover,
the masked label strategy of LGNN is proven to guarantee the well-posedness of
IGNN in a network-agnostic manner, allowing its network to be more complex and
thus more expressive. Combining the
advantages of LGNN and IGNN, Label-inputted Implicit GNN (LI-GNN) is proposed.
It can be widely applied to any specific GNN to boost its performance. Node
classification experiments on two synthesized and six real-world datasets
demonstrate its effectiveness. Code is available at
https://github.com/cf020031308/LI-GNN
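
To make the mechanism concrete, the following is a minimal PyTorch sketch of the two ingredients the abstract combines: masked training labels fed into the network as input (the LGNN side) and a fixed-point forward pass differentiated implicitly with constant memory (the IGNN side). It is an illustration, not the authors' released implementation (see the repository above); the tanh update rule `f`, all tensor names, and the iteration counts are assumptions made for the example.

```python
import torch

# Sketch of a DEQ-style, label-inputted GNN layer. Illustrative only: the
# update rule f, tensor shapes, and iteration counts are assumptions, not
# the authors' LI-GNN code.

def f(z, adj, inp, w_z, w_in):
    # One propagation step: mix neighbor states z with node inputs.
    return torch.tanh(adj @ z @ w_z + inp @ w_in)

def equilibrium_forward(adj, x, labels, train_mask, w_z, w_in,
                        fwd_iters=50, bwd_iters=50, tol=1e-5):
    # Masked label strategy: only (masked) training labels enter as input,
    # so a node's own label cannot leak into its prediction.
    y_in = labels * train_mask.float().unsqueeze(1)
    inp = torch.cat([x, y_in], dim=1)

    # Forward: iterate z <- f(z) toward the equilibrium z* = f(z*). No
    # autograd graph is built, so memory is constant in the iteration count.
    z = torch.zeros(x.size(0), w_z.size(1))
    with torch.no_grad():
        for _ in range(fwd_iters):
            z_next = f(z, adj, inp, w_z, w_in)
            if (z_next - z).norm() < tol:
                z = z_next
                break
            z = z_next

    # Backward: one differentiable application of f re-attaches z* to the
    # autograd graph. The hook then solves g = grad + J_f(z*)^T g by
    # fixed-point iteration, which yields the implicit-function-theorem
    # gradient without storing any forward iterations.
    z0 = z.detach().requires_grad_()
    z_star = f(z0, adj, inp, w_z, w_in)
    if z_star.requires_grad:
        def implicit_grad(grad):
            g = grad
            for _ in range(bwd_iters):
                g = torch.autograd.grad(z_star, z0, g,
                                        retain_graph=True)[0] + grad
            return g
        z_star.register_hook(implicit_grad)
    return z_star
```

Training then looks like ordinary supervised learning: a readout on the returned equilibrium feeds a cross-entropy loss over the training nodes, and loss.backward() triggers the hook. Note that the contractive tanh update here is only a convenient stand-in; per the abstract, it is the masked label strategy that guarantees well-posedness in a network-agnostic manner.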
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs).
S$^2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z) - Label Deconvolution for Node Representation Learning on Large-scale
Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z) - GNN-Ensemble: Towards Random Decision Graph Neural Networks [3.7620848582312405]
Graph Neural Networks (GNNs) have enjoyed widespread application to graph-structured data.
GNNs are required to learn latent patterns from a limited amount of training data to perform inferences on a vast amount of test data.
In this paper, we push ensemble learning of GNNs one step forward, improving accuracy and robustness against adversarial attacks.
arXiv Detail & Related papers (2023-03-20T18:24:01Z) - Robust Graph Neural Networks using Weighted Graph Laplacian [1.8292714902548342]
Graph neural networks (GNNs) are vulnerable to noise and adversarial attacks in the input data.
We propose a generic framework for robustifying GNNs, known as Weighted Laplacian GNN (RWL-GNN).
arXiv Detail & Related papers (2022-08-03T05:36:35Z) - Hybrid Graph Neural Networks for Few-Shot Learning [85.93495480949079]
Graph neural networks (GNNs) have been used to tackle the few-shot learning problem.
Under the inductive setting, existing GNN-based methods are less competitive.
We propose a novel hybrid GNN model consisting of two GNNs, an instance GNN and a prototype GNN.
arXiv Detail & Related papers (2021-12-13T10:20:15Z) - $p$-Laplacian Based Graph Neural Networks [27.747195341003263]
Graph neural networks (GNNs) have demonstrated superior performance for semi-supervised node classification on graphs.
We propose a new $p$-Laplacian based GNN model, termed as $p$GNN, whose message passing mechanism is derived from a discrete regularization framework.
We show that the new message passing mechanism works simultaneously as low-pass and high-pass filters, thus making $p$GNNs effective on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2021-11-14T13:16:28Z) - Is Heterophily A Real Nightmare For Graph Neural Networks To Do Node
Classification? [44.71818395535755]
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by using graph structures, based on an inductive bias (the homophily assumption).
However, the performance advantages of GNNs over graph-agnostic NNs do not appear to be generally satisfactory.
Heterophily has been considered as a main cause and numerous works have been put forward to address it.
arXiv Detail & Related papers (2021-09-12T23:57:05Z) - AdaGNN: A multi-modal latent representation meta-learner for GNNs based
on AdaBoosting [0.38073142980733]
Graph Neural Networks (GNNs) focus on extracting intrinsic network features.
We propose a boosting-based meta-learner for GNNs.
AdaGNN performs exceptionally well for applications with rich and diverse node neighborhood information.
arXiv Detail & Related papers (2021-08-14T03:07:26Z) - Optimization of Graph Neural Networks: Implicit Acceleration by Skip
Connections and More Depth [57.10183643449905]
Graph Neural Networks (GNNs) have been studied through the lenses of expressive power and generalization.
We study the optimization dynamics of GNNs, showing how skip connections and greater depth implicitly accelerate training.
Our results provide the first theoretical support for the success of GNNs.
arXiv Detail & Related papers (2021-05-10T17:59:01Z) - Identity-aware Graph Neural Networks [63.6952975763946]
We develop Identity-aware Graph Neural Networks (ID-GNNs), a class of message-passing GNNs with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z) - Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)