How Powerful is Implicit Denoising in Graph Neural Networks
- URL: http://arxiv.org/abs/2209.14514v1
- Date: Thu, 29 Sep 2022 02:19:39 GMT
- Title: How Powerful is Implicit Denoising in Graph Neural Networks
- Authors: Songtao Liu, Rex Ying, Hanze Dong, Lu Lin, Jinghui Chen, Dinghao Wu
- Abstract summary: We conduct a comprehensive theoretical study and analyze when and why implicit denoising happens in GNNs.
Our theoretical analysis suggests that implicit denoising largely depends on the connectivity, the graph size, and the GNN architecture.
We derive a robust graph convolution that enhances the smoothness of node representations and strengthens the implicit denoising effect.
- Score: 33.01155523195073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs), which aggregate features from neighbors, are
widely used for processing graph-structured data due to their powerful
representation learning capabilities. It is generally believed that GNNs can
implicitly remove non-predictive noise. However, the analysis of the implicit
denoising effect in graph neural networks remains open. In this work, we
conduct a comprehensive theoretical study and analyze when and why implicit
denoising happens in GNNs. Specifically, we study the convergence properties of
the noise matrix. Our theoretical analysis suggests that implicit denoising
largely depends on the connectivity, the graph size, and the GNN architecture.
Moreover, we formally define the adversarial graph signal denoising (AGSD)
problem by extending the graph signal denoising problem. By solving this
problem, we derive a robust graph convolution that enhances the smoothness of
node representations and strengthens the implicit denoising effect. Extensive
empirical evaluations verify our theoretical analyses and the effectiveness of
our proposed model.
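For context, the (non-adversarial) graph signal denoising problem that AGSD extends is commonly written as min_F ||F - X||_F^2 + c * tr(F^T L F), where X is the noisy node-feature matrix and L is the normalized graph Laplacian; a single gradient step on this objective, started from F = X, already yields GCN-style feature propagation, which is one standard way to see why GNN aggregation denoises implicitly. The sketch below illustrates this connection; the 4-cycle graph, the coefficient c, and the step size are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def gsd_one_step(X, A, c=1.0, lr=0.5):
    """One gradient step, started at F = X, on the graph signal denoising
    objective  min_F ||F - X||_F^2 + c * tr(F^T L F),
    where L = I - D^{-1/2} A D^{-1/2} is the normalized Laplacian.
    The gradient at F = X is 2c * L X, so with lr = 0.5 and c = 1 the update
    is F = (I - L) X = D^{-1/2} A D^{-1/2} X, i.e. GCN-style propagation.
    """
    deg = A.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    L = np.eye(A.shape[0]) - d_inv_sqrt @ A @ d_inv_sqrt
    grad_at_X = 2.0 * c * (L @ X)  # gradient of the objective at F = X
    return X - lr * grad_at_X

# Toy graph: a 4-cycle; the true signal is constant, the noise is additive.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = 1.0 + 0.3 * rng.standard_normal((4, 2))   # signal (all-ones) + noise
F = gsd_one_step(X, A)
print("noise energy before:", np.linalg.norm(X - 1.0))
print("noise energy after: ", np.linalg.norm(F - 1.0))  # typically smaller
```

On this toy 4-cycle the constant signal lies in the nullspace of the normalized Laplacian (the graph is regular), so one smoothing step leaves the signal intact while averaging, and thereby shrinking, the noise.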
Related papers
- You Can't Ignore Either: Unifying Structure and Feature Denoising for Robust Graph Learning [34.52299775051481]
We develop a unified graph denoising (UGD) framework to unravel the deadlock between structure and feature denoising.
Specifically, a high-order neighborhood proximity evaluation method is proposed to recognize noisy edges.
We also propose to refine noisy features with reconstruction based on a graph auto-encoder.
arXiv Detail & Related papers (2024-08-01T16:43:55Z)
- Robust Graph Neural Network based on Graph Denoising [10.564653734218755]
Graph Neural Networks (GNNs) have emerged as a notable alternative to address learning problems dealing with non-Euclidean datasets.
This work proposes a robust implementation of GNNs that explicitly accounts for the presence of perturbations in the observed topology.
arXiv Detail & Related papers (2023-12-11T17:43:57Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- A Non-Asymptotic Analysis of Oversmoothing in Graph Neural Networks [33.35609077417775]
We characterize the mechanism behind the phenomenon via a non-asymptotic analysis.
We show that oversmoothing happens once the mixing effect starts to dominate the denoising effect.
Our results suggest that while PPR mitigates oversmoothing at deeper layers, PPR-based architectures still achieve their best performance at a shallow depth.
arXiv Detail & Related papers (2022-12-21T00:33:59Z)
- Oversquashing in GNNs through the lens of information contraction and graph expansion [6.8222473597904845]
We present a framework for analyzing oversquashing based on information contraction.
We propose a graph rewiring algorithm aimed at alleviating oversquashing.
arXiv Detail & Related papers (2022-08-06T08:44:39Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
- Understanding Graph Neural Networks from Graph Signal Denoising Perspectives [27.148827305359436]
Graph neural networks (GNNs) have attracted much attention because of their excellent performance on tasks such as node classification.
This paper aims to provide a theoretical framework to understand GNNs, specifically, spectral graph convolutional networks and graph attention networks.
arXiv Detail & Related papers (2020-06-08T07:10:39Z)
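Returning to the main paper: to make its headline claim concrete, namely that the strength of implicit denoising depends on connectivity and graph size, the following self-contained sketch averages i.i.d. zero-mean noise over neighborhoods of growing size k and reports the residual noise level, which shrinks roughly like 1/sqrt(k). The random k-neighborhoods and the noise scale sigma are illustrative assumptions rather than the paper's actual setting.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's analysis): mean aggregation
# over a neighborhood of size k averages i.i.d. zero-mean noise, so the
# residual noise standard deviation shrinks roughly like 1/sqrt(k).
# Denser (better-connected) graphs therefore denoise more per layer.
rng = np.random.default_rng(0)
n, d, sigma = 2000, 16, 1.0        # nodes, feature dim, noise scale (assumed)

for k in (2, 8, 32, 128):          # neighborhood sizes to compare
    noise = sigma * rng.standard_normal((n, d))
    neighbors = rng.integers(0, n, size=(n, k))   # random k-neighborhoods
    aggregated = noise[neighbors].mean(axis=1)    # mean aggregation per node
    print(f"k={k:4d}  residual noise std = {aggregated.std():.3f}"
          f"  (theory ~ {sigma / np.sqrt(k):.3f})")
```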