You Can't Ignore Either: Unifying Structure and Feature Denoising for Robust Graph Learning
- URL: http://arxiv.org/abs/2408.00700v1
- Date: Thu, 1 Aug 2024 16:43:55 GMT
- Title: You Can't Ignore Either: Unifying Structure and Feature Denoising for Robust Graph Learning
- Authors: Tianmeng Yang, Jiahao Meng, Min Zhou, Yaming Yang, Yujing Wang, Xiangtai Li, Yunhai Tong
- Abstract summary: We develop a unified graph denoising (UGD) framework to unravel the deadlock between structure and feature denoising.
Specifically, a high-order neighborhood proximity evaluation method is proposed to recognize noisy edges.
We also propose to refine noisy features with reconstruction based on a graph auto-encoder.
- Score: 34.52299775051481
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent research on the robustness of Graph Neural Networks (GNNs) under noise or attacks has attracted great attention due to its importance in real-world applications. Most previous methods address a single noise source, either recovering corrupted node embeddings by relying on a reliable structure bias or learning the structure by relying on reliable node features. However, noise and attacks may affect both the structures and the features of a graph, making graph denoising a dilemma and a challenging problem. In this paper, we develop a unified graph denoising (UGD) framework to unravel the deadlock between structure and feature denoising. Specifically, a high-order neighborhood proximity evaluation method is proposed to recognize noisy edges, since features may be perturbed simultaneously. Moreover, we propose to refine noisy features through reconstruction with a graph auto-encoder. An iterative updating algorithm is further designed to optimize the framework and obtain a clean graph, enabling robust graph learning for downstream tasks. Our UGD framework is self-supervised and can be easily implemented as a plug-and-play module. We carry out extensive experiments, which prove the effectiveness and advantages of our method. Code is available at https://github.com/YoungTimmy/UGD.
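For intuition, here is a deliberately simplified NumPy sketch of the two components described above. The proximity order, the threshold `tau`, the linear auto-encoder, and all function names are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import numpy as np

def high_order_proximity(adj: np.ndarray, order: int = 2) -> np.ndarray:
    """Accumulate k-step transition probabilities as a high-order
    proximity score for every node pair (illustrative stand-in)."""
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    trans = adj / deg                              # row-normalized transitions
    prox, power = np.zeros_like(adj), np.eye(adj.shape[0])
    for _ in range(order):
        power = power @ trans                      # k-step probabilities
        prox += power
    return prox

def drop_noisy_edges(adj, x, order=2, tau=0.05):
    """Score edges by high-order proximity weighted by feature similarity,
    then prune low-scoring edges (the noisy-edge recognition step)."""
    prox = high_order_proximity(adj, order)
    xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    score = prox * (xn @ xn.T)                     # proximity times cosine similarity
    keep = (score >= tau) | (score.T >= tau)       # keep symmetric survivors
    return adj * keep

def refine_features(adj, x, dim=16, steps=200, lr=1e-2, seed=0):
    """Toy linear graph auto-encoder trained with plain gradient descent;
    its reconstruction serves as the denoised feature matrix."""
    rng = np.random.default_rng(seed)
    n, d = x.shape
    a_hat = adj + np.eye(n)
    a_hat = a_hat / a_hat.sum(axis=1, keepdims=True)
    w_enc = rng.normal(scale=0.1, size=(d, dim))
    w_dec = rng.normal(scale=0.1, size=(dim, d))
    for _ in range(steps):
        z = a_hat @ x @ w_enc                      # encode with neighborhood mixing
        err = z @ w_dec - x                        # reconstruction error
        g_dec = (z.T @ err) / n                    # MSE gradients
        g_enc = ((a_hat @ x).T @ (err @ w_dec.T)) / n
        w_dec -= lr * g_dec
        w_enc -= lr * g_enc
    return (a_hat @ x @ w_enc) @ w_dec

def ugd_iterate(adj, x, rounds=3):
    """Alternate the two steps, mirroring the iterative updating algorithm."""
    for _ in range(rounds):
        adj = drop_noisy_edges(adj, x)
        x = refine_features(adj, x)
    return adj, x
```

The alternation is the point: a partially cleaned structure makes feature reconstruction more reliable, and partially cleaned features make noisy edges easier to recognize, which is how the framework escapes the deadlock between the two.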
Related papers
- DEGNN: Dual Experts Graph Neural Network Handling Both Edge and Node Feature Noise [5.048629544493508]
Graph Neural Networks (GNNs) have achieved notable success in various applications over graph data.
Recent research has revealed that real-world graphs often contain noise and that GNNs are susceptible to it.
We present DEGNN, a novel GNN model designed to adeptly mitigate noise in both edges and node features.
arXiv Detail & Related papers (2024-04-14T10:04:44Z)
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- Combating Bilateral Edge Noise for Robust Link Prediction [56.43882298843564]
We propose an information-theory-guided principle, Robust Graph Information Bottleneck (RGIB), to extract reliable supervision signals and avoid representation collapse.
Two instantiations, RGIB-SSL and RGIB-REP, are explored to leverage the merits of different methodologies.
Experiments on six datasets and three GNNs with diverse noisy scenarios verify the effectiveness of our RGIB instantiations.
arXiv Detail & Related papers (2023-11-02T12:47:49Z)
- SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are the de facto solution for learning on structured data.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, BGS-IMC, which models continuous random Gaussian noise in the graph Fourier domain.
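To make the ratings-as-a-graph-signal idea concrete, here is a toy low-pass reconstruction in the graph Fourier domain. The 4-item graph, the rating vector, and the cutoff `K` are invented for illustration; this is not the paper's closed-form solution.

```python
import numpy as np

# Toy example: treat one user's ratings as a signal on an item-item graph
# and smooth it with a low-pass graph Fourier filter (assumed setup).
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])            # item-item similarity graph
L = np.diag(A.sum(axis=1)) - A              # combinatorial Laplacian
eigval, U = np.linalg.eigh(L)               # columns of U: graph Fourier basis

ratings = np.array([5., 4., 0., 1.])        # one user's noisy/partial ratings
coeffs = U.T @ ratings                      # graph Fourier transform
K = 2
coeffs[K:] = 0.                             # keep only the K lowest frequencies
print(np.round(U @ coeffs, 2))              # smoothed ratings (inverse transform)
```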
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- How Powerful is Implicit Denoising in Graph Neural Networks [33.01155523195073]
We conduct a comprehensive theoretical study and analyze when and why the implicit denoising happens in GNNs.
Our theoretical analysis suggests that the implicit denoising largely depends on the connectivity, the graph size, and GNN architectures.
We derive a robust graph convolution, where the smoothness of the node representations and the implicit denoising effect can be enhanced.
arXiv Detail & Related papers (2022-09-29T02:19:39Z)
- Reliable Representations Make A Stronger Defender: Unsupervised Structure Refinement for Robust GNN [36.045702771828736]
Graph Neural Networks (GNNs) have been successful on a wide range of tasks over graph data.
Recent studies have shown that attackers can catastrophically degrade the performance of GNNs by maliciously modifying the graph structure.
We propose an unsupervised pipeline, named STABLE, to optimize the graph structure.
arXiv Detail & Related papers (2022-06-30T10:02:32Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
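As a rough sketch of such agreement maximization, the snippet below computes an InfoNCE-style node-level contrastive loss between embeddings from a learned view and an anchor view. The loss form, the random embeddings, and the temperature are assumptions, not the paper's exact objective.

```python
import numpy as np

def info_nce(z_learned: np.ndarray, z_anchor: np.ndarray, temp: float = 0.5) -> float:
    """Each node's learned embedding should agree with the same node in the
    anchor view (positive pair) more than with all other nodes (negatives)."""
    def normalize(z):
        return z / (np.linalg.norm(z, axis=1, keepdims=True) + 1e-8)
    sim = normalize(z_learned) @ normalize(z_anchor).T / temp   # n x n similarities
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))                   # diagonal = positives

rng = np.random.default_rng(0)
z_anchor = rng.normal(size=(8, 16))                   # anchor-view embeddings
z_learned = z_anchor + 0.1 * rng.normal(size=(8, 16)) # perturbed learned view
print(f"contrastive loss: {info_nce(z_learned, z_anchor):.3f}")
```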
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Local Augmentation for Graph Neural Networks [78.48812244668017]
We introduce local augmentation, which enhances a node's features using its local subgraph structure.
Building on local augmentation, we further design a novel framework, LA-GNN, which can be applied to any GNN model in a plug-and-play manner.
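A minimal sketch of that idea follows: enrich each node's features with information drawn from its immediate neighborhood. LA-GNN learns a conditional generative model for this step; the neighbor-mean-plus-noise stand-in here is a deliberately simplified assumption.

```python
import numpy as np

def locally_augment(adj: np.ndarray, x: np.ndarray,
                    noise: float = 0.01, seed: int = 0) -> np.ndarray:
    """Concatenate each node's features with a sample generated from its
    neighborhood (a crude proxy for LA-GNN's learned generator)."""
    rng = np.random.default_rng(seed)
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    neigh_mean = (adj @ x) / deg                 # mean of neighbor features
    sample = neigh_mean + noise * rng.normal(size=x.shape)
    return np.concatenate([x, sample], axis=1)   # wider features for any GNN
```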
arXiv Detail & Related papers (2021-09-08T18:10:08Z)
- Learning Node Representations from Noisy Graph Structures [38.32421350245066]
Noise is pervasive in real-world networks and can compromise them to a large extent.
We propose a novel framework to learn noise-free node representations and eliminate noises simultaneously.
arXiv Detail & Related papers (2020-12-04T07:18:39Z)
- Understanding Graph Neural Networks from Graph Signal Denoising Perspectives [27.148827305359436]
Graph neural networks (GNNs) have attracted much attention because of their excellent performance on tasks such as node classification.
This paper aims to provide a theoretical framework to understand GNNs, specifically, spectral graph convolutional networks and graph attention networks.
arXiv Detail & Related papers (2020-06-08T07:10:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.