Training Robust Graph Neural Networks by Modeling Noise Dependencies
- URL: http://arxiv.org/abs/2502.19670v1
- Date: Thu, 27 Feb 2025 01:30:13 GMT
- Title: Training Robust Graph Neural Networks by Modeling Noise Dependencies
- Authors: Yeonjun In, Kanghoon Yoon, Sukwon Yun, Kibum Kim, Sungchul Kim, Chanyoung Park
- Abstract summary: In real-world applications, node features in graphs often contain noise from various sources, leading to significant performance degradation. We introduce a more realistic noise scenario, dependency-aware noise on graphs (DANG), where noise in node features creates a chain of noise dependencies that propagates to the graph structure and node labels. We propose a novel robust GNN, DA-GNN, which captures the causal relationships among variables in the data generating process (DGP) of DANG using variational inference.
- Score: 28.1151026795484
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In real-world applications, node features in graphs often contain noise from various sources, leading to significant performance degradation in GNNs. Although several methods have been developed to enhance robustness, they rely on the unrealistic assumption that noise in node features is independent of the graph structure and node labels, thereby limiting their applicability. To address this, we introduce a more realistic noise scenario, dependency-aware noise on graphs (DANG), where noise in node features creates a chain of noise dependencies that propagates to the graph structure and node labels. We propose a novel robust GNN, DA-GNN, which captures the causal relationships among variables in the data generating process (DGP) of DANG using variational inference. In addition, we present new benchmark datasets that simulate DANG in real-world applications, enabling more practical research on robust GNNs. Extensive experiments demonstrate that DA-GNN consistently outperforms existing baselines across various noise scenarios, including both DANG and conventional noise models commonly considered in this field.
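The abstract describes a noise chain that starts in node features and then propagates to the graph structure and node labels. A minimal, hypothetical sketch of such a dependency-aware data generating process is given below; the specific mechanisms (Gaussian feature corruption, similarity-based rewiring, centroid-based relabeling) and the function name simulate_dang are illustrative assumptions, not the paper's actual DGP or benchmark construction.

```python
# Illustrative DANG-style corruption: feature noise first, then structure
# and label noise that *depend* on the corrupted features. All mechanisms
# here are assumptions for illustration, not the authors' benchmark code.
import numpy as np

def simulate_dang(X, A, y, noise_rate=0.2, sigma=1.0, seed=0):
    """X: (n, d) features, A: (n, n) dense adjacency, y: (n,) integer labels."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    noisy = rng.random(n) < noise_rate            # nodes whose features get corrupted
    X_noisy = X.astype(float).copy()
    X_noisy[noisy] += rng.normal(0.0, sigma, size=(noisy.sum(), d))

    # Structure noise depends on the corrupted features: each corrupted node
    # gains a spurious edge to its most similar node in *noisy* feature space.
    A_noisy = A.copy()
    sims = X_noisy @ X_noisy.T
    np.fill_diagonal(sims, -np.inf)
    for v in np.where(noisy)[0]:
        u = int(np.argmax(sims[v]))
        A_noisy[v, u] = A_noisy[u, v] = 1

    # Label noise also depends on the corrupted features: corrupted nodes are
    # relabeled to the class whose noisy-feature centroid is closest.
    y_noisy = y.copy()
    classes = np.unique(y)
    centroids = np.stack([X_noisy[y == c].mean(axis=0) for c in classes])
    dists = ((X_noisy[noisy, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    y_noisy[noisy] = classes[np.argmin(dists, axis=1)]
    return X_noisy, A_noisy, y_noisy
```

The point of the sketch is the dependency chain itself: unlike the independent-noise assumption criticized in the abstract, here both the structure noise and the label noise are functions of the corrupted features.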
Related papers
- Using Random Noise Equivariantly to Boost Graph Neural Networks Universally [27.542173012315413]
Recent work on Graph Neural Networks (GNNs) has explored the potential of random noise as an input feature to enhance expressivity across diverse tasks. This paper lays down a theoretical framework that elucidates the increased sample complexity when random noise is incorporated into GNNs without careful design. We propose Equivariant Noise GNN (ENGNN), a novel architecture that harnesses the symmetrical properties of noise to reduce sample complexity and bolster generalization.
arXiv Detail & Related papers (2025-02-04T16:54:28Z)
- Noise-Resilient Unsupervised Graph Representation Learning via Multi-Hop Feature Quality Estimation [53.91958614666386]
Unsupervised graph representation learning (UGRL) based on graph neural networks (GNNs) is susceptible to noise in node features.
We propose a novel UGRL method based on Multi-hop feature Quality Estimation (MQE).
arXiv Detail & Related papers (2024-07-29T12:24:28Z)
- DEGNN: Dual Experts Graph Neural Network Handling Both Edge and Node Feature Noise [5.048629544493508]
Graph Neural Networks (GNNs) have achieved notable success in various applications over graph data.
Recent research has revealed that real-world graphs often contain noise, and GNNs are susceptible to noise in the graph.
We present DEGNN, a novel GNN model designed to adeptly mitigate noise in both edges and node features.
arXiv Detail & Related papers (2024-04-14T10:04:44Z)
- Feature Noise Boosts DNN Generalization under Label Noise [65.36889005555669]
The presence of label noise in the training data has a profound impact on the generalization of deep neural networks (DNNs).
In this study, we introduce and theoretically demonstrate a simple feature noise method, which directly adds noise to the features of training data (a toy sketch follows this entry).
arXiv Detail & Related papers (2023-08-03T08:31:31Z)
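The entry above describes the method at the level of a single operation: perturb the training features themselves. The following toy sketch illustrates that idea; the Gaussian noise type, the scale parameter, and the function name add_feature_noise are illustrative assumptions rather than the paper's exact recipe.

```python
# Toy sketch of training-time feature noise, assuming additive Gaussian
# corruption; the paper's precise noise distribution may differ.
import numpy as np

def add_feature_noise(X, scale=0.1, seed=None):
    """Return a noisy copy of the training features X (n_samples, n_dims)."""
    rng = np.random.default_rng(seed)
    return X + rng.normal(0.0, scale, size=X.shape)

# Typical use: corrupt each mini-batch before the forward pass,
# e.g. X_batch = add_feature_noise(X_batch, scale=0.1).
```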
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Deep Graph Neural Networks via Posteriori-Sampling-based Node-Adaptive Residual Module [65.81781176362848]
Graph Neural Networks (GNNs) can learn from graph-structured data through neighborhood information aggregation.
As the number of layers increases, node representations become indistinguishable, which is known as over-smoothing.
We propose a Posteriori-Sampling-based, Node-Adaptive Residual module (PSNR).
arXiv Detail & Related papers (2023-05-09T12:03:42Z)
- How Powerful is Implicit Denoising in Graph Neural Networks [33.01155523195073]
We conduct a comprehensive theoretical study and analyze when and why the implicit denoising happens in GNNs.
Our theoretical analysis suggests that the implicit denoising largely depends on the connectivity, the graph size, and GNN architectures.
We derive a robust graph convolution, where the smoothness of the node representations and the implicit denoising effect can be enhanced.
arXiv Detail & Related papers (2022-09-29T02:19:39Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- A Comparative Study on Robust Graph Neural Networks to Structural Noises [12.44737954516764]
Graph neural networks (GNNs) learn node representations by passing and aggregating messages between neighboring nodes.
GNNs can be vulnerable to structural noise because the message passing mechanism may propagate noise through the entire graph.
We conduct a comprehensive and systematic comparative study on different types of robust GNNs under consistent structural noise settings.
arXiv Detail & Related papers (2021-12-11T21:01:29Z)
- Local Augmentation for Graph Neural Networks [78.48812244668017]
We introduce local augmentation, which enhances node features using their local subgraph structures.
Based on the local augmentation, we further design a novel framework, LA-GNN, which can be applied to any GNN model in a plug-and-play manner.
arXiv Detail & Related papers (2021-09-08T18:10:08Z)
- Stochastic Aggregation in Graph Neural Networks [9.551282469099887]
Graph neural networks (GNNs) manifest pathologies including over-smoothing and limited discriminating power.
We present a unifying framework for stochastic aggregation (STAG) in GNNs, where noise is (adaptively) injected into the aggregation process from the neighborhood to form node embeddings.
arXiv Detail & Related papers (2021-02-25T02:52:03Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem (a minimal sketch of this view follows the list).
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
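The last entry above states that GNN aggregation can be read as solving a graph denoising problem. Below is a minimal sketch of that classical view, assuming the standard objective ||F - X||_F^2 + c * tr(F^T L F) with the symmetrically normalized Laplacian L; the function name, step size, and iteration count are illustrative choices, not the ADA-UGNN algorithm itself.

```python
# Gradient descent on the graph signal denoising objective
#   min_F ||F - X||_F^2 + c * tr(F^T L F),
# where L = I - D^{-1/2} A D^{-1/2} is the normalized Laplacian.
import numpy as np

def denoise_features(X, A, c=1.0, lr=0.5, steps=1):
    deg = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    A_hat = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # D^{-1/2} A D^{-1/2}
    L = np.eye(A.shape[0]) - A_hat                         # normalized Laplacian
    F = X.astype(float).copy()
    for _ in range(steps):
        grad = 2.0 * (F - X) + 2.0 * c * (L @ F)           # gradient of the objective
        F -= lr * grad
    return F

# With the defaults above, a single step from F = X yields F = A_hat @ X,
# i.e. exactly one round of GCN-style neighborhood aggregation. Running
# multiple steps requires a smaller step size (roughly lr < 1/(1 + 2c))
# for the iteration to remain stable.
```

This one-step equivalence is the core of the unified view: standard message passing is an implicit (partial) solution of a feature denoising problem on the graph.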