Understanding Graph Neural Networks from Graph Signal Denoising
Perspectives
- URL: http://arxiv.org/abs/2006.04386v1
- Date: Mon, 8 Jun 2020 07:10:39 GMT
- Title: Understanding Graph Neural Networks from Graph Signal Denoising
Perspectives
- Authors: Guoji Fu and Yifan Hou and Jian Zhang and Kaili Ma and Barakeel Fanseu
Kamhoua and James Cheng
- Abstract summary: Graph neural networks (GNNs) have attracted much attention because of their excellent performance on tasks such as node classification.
This paper aims to provide a theoretical framework to understand GNNs, specifically, spectral graph convolutional networks and graph attention networks.
- Score: 27.148827305359436
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have attracted much attention because of their
excellent performance on tasks such as node classification. However, there is
inadequate understanding of how and why GNNs work, especially for node
representation learning. This paper aims to provide a theoretical framework to
understand GNNs, specifically, spectral graph convolutional networks and graph
attention networks, from graph signal denoising perspectives. Our framework
shows that GNNs are implicitly solving graph signal denoising problems:
spectral graph convolutions work as denoising node features, while graph
attentions work as denoising edge weights. We also show that a linear
self-attention mechanism is able to compete with the state-of-the-art graph
attention methods. Our theoretical results further lead to two new models,
GSDN-F and GSDN-EF, which work effectively for graphs with noisy node features
and/or noisy edges. We validate our theoretical findings and also the
effectiveness of our new models by experiments on benchmark datasets. The
source code is available at \url{https://github.com/fuguoji/GSDN}.
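To make the denoising claim concrete, here is a minimal numerical check (not from the paper's released code) that one gradient-descent step on the graph signal denoising objective min_F ||F - X||_F^2 + c * tr(F^T L F), taken from F = X with step size 1/2 and c = 1, reproduces GCN-style propagation. The random graph, features, and coefficient choices below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, c = 5, 3, 1.0
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T                                   # random undirected graph
A_hat = A + np.eye(n)                         # add self-loops
D_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt      # symmetrically normalized adjacency
L = np.eye(n) - A_norm                        # normalized graph Laplacian
X = rng.normal(size=(n, d))                   # noisy node features

# Objective: ||F - X||_F^2 + c * tr(F^T L F); its gradient at F = X is 2*c*L@X.
grad = 2 * c * L @ X
F = X - 0.5 * grad                            # one gradient step, step size 1/2
assert np.allclose(F, A_norm @ X)             # equals GCN propagation when c = 1
```

With c = 1 the step gives F = (I - L) X = A_norm @ X, i.e. spectral graph convolution arises as an implicit denoiser of the node features.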
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
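As a rough illustration of greedy selection over spectral embeddings, the sketch below uses generic k-center greedy selection as a stand-in; SGGC's exact criterion differs, and `embeddings` (one spectral embedding per ego-graph) is an assumed input:

```python
import numpy as np

def greedy_coreset(embeddings: np.ndarray, k: int) -> list[int]:
    """k-center greedy selection: repeatedly pick the point farthest from
    the current selection so the chosen subset covers the embedding space.
    A simplified stand-in for SGGC's spectral greedy step."""
    dists = np.linalg.norm(embeddings - embeddings[0], axis=1)
    selected = [0]
    for _ in range(k - 1):
        nxt = int(np.argmax(dists))          # farthest remaining point
        selected.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(embeddings - embeddings[nxt], axis=1))
    return selected
```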
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- Robust Graph Neural Network based on Graph Denoising [10.564653734218755]
Graph Neural Networks (GNNs) have emerged as a promising alternative to address learning problems dealing with non-Euclidean datasets.
This work proposes a robust implementation of GNNs that explicitly accounts for the presence of perturbations in the observed topology.
arXiv Detail & Related papers (2023-12-11T17:43:57Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Graph Neural Tangent Kernel: Convergence on Large Graphs [7.624781434274796]
Graph neural networks (GNNs) achieve remarkable performance in graph machine learning tasks.
We investigate the training dynamics of large-graph GNNs using graph neural kernels (GNTKs) and graphons.
arXiv Detail & Related papers (2023-01-25T19:52:58Z)
- A Spectral Analysis of Graph Neural Networks on Dense and Sparse Graphs [13.954735096637298]
We analyze how sparsity affects the graph spectra, and thus the performance of graph neural networks (GNNs) in node classification on dense and sparse graphs.
We show that GNNs can outperform spectral methods on sparse graphs, and illustrate these results with numerical examples on both synthetic and real graphs.
arXiv Detail & Related papers (2022-11-06T22:38:13Z)
- FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
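A sketch of the first-order idea (simplified; FoSR's actual criterion is stated on the normalized adjacency and includes a relabeling step): by first-order eigenvalue perturbation, adding edge (u, v) increases the Laplacian's algebraic connectivity lambda_2 by roughly (x_u - x_v)^2, where x is the Fiedler vector, so one can greedily add the non-edge with the largest predicted gain:

```python
import numpy as np

def add_edge_first_order(A: np.ndarray) -> tuple[int, int]:
    """Add the non-edge with the largest first-order gain in lambda_2.
    Adding (u, v) perturbs the Laplacian by (e_u - e_v)(e_u - e_v)^T,
    so the first-order change in lambda_2 is (x[u] - x[v])**2 for the
    Fiedler vector x. Assumes at least one non-edge exists."""
    L = np.diag(A.sum(axis=1)) - A
    _, eigvecs = np.linalg.eigh(L)
    x = eigvecs[:, 1]                        # Fiedler vector
    n = A.shape[0]
    best, best_gain = (0, 0), -np.inf
    for u in range(n):
        for v in range(u + 1, n):
            if A[u, v] == 0:
                gain = (x[u] - x[v]) ** 2    # predicted lambda_2 gain
                if gain > best_gain:
                    best, best_gain = (u, v), gain
    u, v = best
    A[u, v] = A[v, u] = 1                    # rewire in place
    return best
```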
arXiv Detail & Related papers (2022-10-21T07:58:03Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
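A minimal sketch of the unrolling idea, under assumptions not in the abstract: a fixed degree-2 matrix polynomial as the forward "convolutional mixture", and a fixed step size with a soft-threshold prox in place of GDN's learned per-layer parameters:

```python
import numpy as np

def soft_threshold(X, lam):
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

def gdn_unrolled(A_obs, h=(0.0, 1.0, 0.5), steps=10, tau=0.01, lam=0.01):
    """Unrolled proximal-gradient sketch for graph deconvolution: fit a
    latent A so that h0*I + h1*A + h2*A@A matches the observed graph.
    In GDN proper, the per-layer step sizes and thresholds are learned."""
    A = A_obs.copy()
    for _ in range(steps):
        P = h[0] * np.eye(len(A)) + h[1] * A + h[2] * (A @ A)
        R = P - A_obs                             # residual
        # gradient of 0.5*||P - A_obs||_F^2 w.r.t. A
        G = h[1] * R + h[2] * (R @ A.T + A.T @ R)
        A = soft_threshold(A - tau * G, lam)      # prox step promotes sparsity
    return A
```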
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Towards Robust Graph Neural Networks for Noisy Graphs with Sparse Labels [24.25945793671978]
We study a novel problem of developing robust GNNs on noisy graphs with limited labeled nodes.
Our analysis shows that both the noisy edges and limited labeled nodes could harm the message-passing mechanism of GNNs.
We propose a novel framework which adopts the noisy edges as supervision to learn a denoised and dense graph.
arXiv Detail & Related papers (2022-01-01T19:00:26Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
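A sketch of the sampling-and-growing loop; the example graphon W, the size schedule, and the training call are illustrative placeholders:

```python
import numpy as np

def sample_graph_from_graphon(W, n, rng):
    """Bernoulli-sample an n-node graph from a graphon W: [0,1]^2 -> [0,1].
    Draw latent positions u_i ~ Uniform[0,1], then connect each pair
    (i, j) independently with probability W(u_i, u_j)."""
    u = rng.uniform(size=n)
    P = W(u[:, None], u[None, :])
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)                       # keep one draw per pair, no self-loops
    return A + A.T

rng = np.random.default_rng(0)
W = lambda x, y: 0.8 * np.exp(-3.0 * np.abs(x - y))   # an example graphon
for n in (64, 128, 256, 512):               # successively grow the graph
    A = sample_graph_from_graphon(W, n, rng)
    # train_gnn(A, ...)  # hypothetical: continue training at each size
```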
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem, and we unify them in a single framework, UGNN.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
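In the shared UGNN objective ||F - X||_F^2 + c * tr(F^T L F), the smoothness weight c is global. A plausible node-adaptive variant of the smoothness term, in the spirit of ADA-UGNN (the exact form is assumed here, not quoted from the paper):

```latex
\min_{F}\; \|F - X\|_F^2 \;+\; \sum_{i \in \mathcal{V}} \frac{c_i}{2}
\sum_{j \in \mathcal{N}(i)} \left\| \frac{F_i}{\sqrt{d_i}} - \frac{F_j}{\sqrt{d_j}} \right\|_2^2
```

where c_i is a node-dependent smoothness weight; setting c_i equal for all nodes recovers the shared objective.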
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black-boxes and lack human intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)