Learning Node Representations from Noisy Graph Structures
- URL: http://arxiv.org/abs/2012.02434v1
- Date: Fri, 4 Dec 2020 07:18:39 GMT
- Title: Learning Node Representations from Noisy Graph Structures
- Authors: Junshan Wang, Ziyao Li, Qingqing Long, Weiyu Zhang, Guojie Song, Chuan Shi
- Abstract summary: Noise prevails in real-world networks and compromises them to a large extent.
We propose a novel framework that simultaneously learns noise-free node representations and eliminates noise.
- Score: 38.32421350245066
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning low-dimensional representations on graphs has proved
effective in various downstream tasks. However, noise prevails in real-world
networks and compromises them to a large extent, because edges propagate noise
through the whole network rather than confining it to a single node. While
existing methods tend to focus on preserving structural properties, the
robustness of the learned representations against noise is generally ignored.
In this paper, we propose a novel framework that simultaneously learns
noise-free node representations and eliminates noise. Since the noise in real
graphs is often unknown, we design two generators, namely a graph generator
and a noise generator, to identify normal structures and noise in an
unsupervised setting. On the one hand, the graph generator serves as a unified
scheme for incorporating any useful graph prior knowledge to generate normal
structures; we illustrate the generative process with community structures and
power-law degree distributions as examples. On the other hand, the noise
generator produces graph noise that both satisfies some fundamental properties
and adapts to the data, so real noise with arbitrary distributions can be
handled successfully. Finally, to eliminate noise and obtain noise-free node
representations, the two generators are optimized jointly; through maximum
likelihood estimation, the model reduces to imposing different regularization
constraints on the true graph and on the noise respectively. Our model is
evaluated on both real-world and synthetic data. It outperforms other strong
baselines on node classification and graph reconstruction tasks, demonstrating
its ability to eliminate graph noise.
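The paper's generators are probabilistic and learned, but the core idea of splitting an observed graph into a regularized "true" part and a regularized noise part can be sketched with a much simpler alternating scheme (an illustrative analogue, not the authors' model): a low-rank prior stands in for the community-structure prior on the clean graph, and a sparsity prior stands in for the noise.

```python
import numpy as np

def decompose(A, rank=2, lam=0.3, iters=50):
    """Split an observed adjacency matrix A into a low-rank 'clean'
    part L (community-like structure prior) and a sparse noise part S
    (sparsity prior on noise), by alternating projections."""
    S = np.zeros_like(A, dtype=float)
    for _ in range(iters):
        # Structure step: best rank-`rank` approximation of A - S.
        U, s, Vt = np.linalg.svd(A - S, full_matrices=False)
        s[rank:] = 0.0
        L = U @ np.diag(s) @ Vt
        # Noise step: soft-threshold the residual (sparsity prior).
        R = A - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)
    return L, S
```

On a two-community adjacency matrix with a single injected cross-community edge, the sparse component concentrates on the injected edge while the low-rank component recovers the block structure.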
Related papers
- You Can't Ignore Either: Unifying Structure and Feature Denoising for Robust Graph Learning [34.52299775051481]
We develop a unified graph denoising (UGD) framework to unravel the deadlock between structure and feature denoising.
Specifically, a high-order neighborhood proximity evaluation method is proposed to recognize noisy edges.
We also propose to refine noisy features with reconstruction based on a graph auto-encoder.
arXiv Detail & Related papers (2024-08-01T16:43:55Z) - DEGNN: Dual Experts Graph Neural Network Handling Both Edge and Node Feature Noise [5.048629544493508]
Graph Neural Networks (GNNs) have achieved notable success in various applications over graph data.
Recent research has revealed that real-world graphs often contain noise, and GNNs are susceptible to noise in the graph.
We present DEGNN, a novel GNN model designed to adeptly mitigate noise in both edges and node features.
arXiv Detail & Related papers (2024-04-14T10:04:44Z) - Random Geometric Graph Alignment with Graph Neural Networks [8.08963638000146]
We show that a graph neural network can recover an unknown one-to-one mapping between the vertices of two graphs.
We also prove that our conditions on the noise level are tight up to logarithmic factors.
We demonstrate that when the noise level is at least constant, this direct matching fails to achieve perfect recovery, while the graph neural network can tolerate a noise level growing as fast as a power of the graph size.
arXiv Detail & Related papers (2024-02-12T00:18:25Z) - RDGSL: Dynamic Graph Representation Learning with Structure Learning [23.00398150548281]
Temporal Graph Networks (TGNs) have shown remarkable performance in learning representation for continuous-time dynamic graphs.
However, real-world dynamic graphs typically contain diverse and intricate noise.
Noise can significantly degrade the quality of representation generation, impeding the effectiveness of TGNs in downstream tasks.
arXiv Detail & Related papers (2023-09-05T08:03:59Z) - Local Graph-homomorphic Processing for Privatized Distributed Systems [57.14673504239551]
We show that the added noise does not affect the performance of the learned model.
This is a significant improvement to previous works on differential privacy for distributed algorithms.
arXiv Detail & Related papers (2022-10-26T10:00:14Z) - Learning to Generate Realistic Noisy Images via Pixel-level Noise-aware Adversarial Training [50.018580462619425]
We propose a novel framework, namely Pixel-level Noise-aware Generative Adversarial Network (PNGAN).
PNGAN employs a pre-trained real denoiser to map the fake and real noisy images into a nearly noise-free solution space.
For better noise fitting, we present an efficient architecture Simple Multi-versa-scale Network (SMNet) as the generator.
arXiv Detail & Related papers (2022-04-06T14:09:02Z) - C2N: Practical Generative Noise Modeling for Real-World Denoising [53.96391787869974]
We introduce a Clean-to-Noisy image generation framework, namely C2N, to imitate complex real-world noise without using paired examples.
We construct the noise generator in C2N in accordance with each component of real-world noise characteristics to express a wide range of noise accurately.
arXiv Detail & Related papers (2022-02-19T05:53:46Z) - Graph Denoising with Framelet Regularizer [25.542429117462547]
This paper tailors regularizers for graph data in terms of both feature and structure noises.
Our model achieves significantly better performance compared with popular graph convolutions even when the graph is heavily contaminated.
arXiv Detail & Related papers (2021-11-05T05:17:23Z) - Neighbor2Neighbor: Self-Supervised Denoising from Single Noisy Images [98.82804259905478]
We present Neighbor2Neighbor to train an effective image denoising model with only noisy images.
In detail, the input and target used to train the network are images sub-sampled from the same noisy image.
A denoising network is trained on sub-sampled training pairs generated in the first stage, with a proposed regularizer as additional loss for better performance.
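The sub-sampling step can be sketched as follows (a minimal illustration of the neighbor sub-sampler idea, not the authors' code): each 2x2 cell of the noisy image contributes one pixel to the input sub-image and a different pixel of the same cell to the target sub-image.

```python
import numpy as np

def neighbor_subsample(img, rng):
    """Split a noisy image into two half-resolution sub-images whose
    pixels are drawn from different positions within each 2x2 cell,
    giving an (input, target) training pair."""
    H, W = img.shape
    h, w = H // 2, W // 2
    # Reshape so each 2x2 cell becomes a row of 4 candidate pixels.
    cells = (img[:2 * h, :2 * w]
             .reshape(h, 2, w, 2)
             .transpose(0, 2, 1, 3)
             .reshape(h, w, 4))
    # Pick two distinct pixel indices (0..3) per cell.
    idx1 = rng.integers(0, 4, size=(h, w))
    idx2 = (idx1 + rng.integers(1, 4, size=(h, w))) % 4
    rows, cols = np.indices((h, w))
    return cells[rows, cols, idx1], cells[rows, cols, idx2]
```

Because the two sub-images never share a pixel within a cell, their noise is (approximately) independent, which is what makes the pair usable as input and target for denoising training.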
arXiv Detail & Related papers (2021-01-08T02:03:25Z) - Reconstructing the Noise Manifold for Image Denoising [56.562855317536396]
We introduce the idea of a cGAN which explicitly leverages structure in the image noise space.
By learning directly a low dimensional manifold of the image noise, the generator promotes the removal from the noisy image only that information which spans this manifold.
Based on our experiments, our model substantially outperforms existing state-of-the-art architectures.
arXiv Detail & Related papers (2020-02-11T00:31:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.