Graph Adversarial Diffusion Convolution
- URL: http://arxiv.org/abs/2406.02059v1
- Date: Tue, 4 Jun 2024 07:43:04 GMT
- Title: Graph Adversarial Diffusion Convolution
- Authors: Songtao Liu, Jinghui Chen, Tianfan Fu, Lu Lin, Marinka Zitnik, Dinghao Wu
- Abstract summary: This paper introduces a min-max optimization formulation for the Graph Signal Denoising (GSD) problem. We derive a new variant of the Graph Diffusion Convolution (GDC) architecture, called Graph Adversarial Diffusion Convolution (GADC).
- Score: 49.974206213411904
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces a min-max optimization formulation for the Graph Signal Denoising (GSD) problem. In this formulation, we first maximize the second term of GSD by introducing perturbations to the graph structure based on Laplacian distance and then minimize the overall loss of the GSD. By solving the min-max optimization problem, we derive a new variant of the Graph Diffusion Convolution (GDC) architecture, called Graph Adversarial Diffusion Convolution (GADC). GADC differs from GDC by incorporating an additional term that enhances robustness against adversarial attacks on the graph structure and noise in node features. Moreover, GADC improves the performance of GDC on heterophilic graphs. Extensive experiments demonstrate the effectiveness of GADC across various datasets. Code is available at https://github.com/SongtaoLiu0823/GADC.
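To make the formulation concrete, the standard GSD objective and the adversarially perturbed version described in the abstract can be written as follows. The notation here is our own assumption (the paper's exact constraint set and weighting may differ):

$$\min_{F}\ \|F - X\|_F^2 + \lambda\,\mathrm{tr}\!\left(F^\top L F\right) \quad\longrightarrow\quad \min_{F}\ \max_{\Delta L \in \mathcal{S}(\epsilon)}\ \|F - X\|_F^2 + \lambda\,\mathrm{tr}\!\left(F^\top (L + \Delta L) F\right),$$

where $X$ is the noisy node-feature matrix, $L$ the graph Laplacian, $\lambda$ a smoothness weight, and $\mathcal{S}(\epsilon)$ a budgeted set of structure perturbations measured by Laplacian distance.

For reference, below is a minimal NumPy sketch of the closed-form personalized-PageRank diffusion used by the GDC baseline that GADC modifies. The function name, default parameters, and sparsification threshold are illustrative; the additional robustness term that distinguishes GADC is not reproduced here (see the authors' repository for the actual implementation):

```python
import numpy as np

def gdc_ppr_diffusion(A, alpha=0.15, eps=1e-4):
    """Illustrative GDC-style diffusion: S = alpha * (I - (1 - alpha) * T)^{-1}."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)                      # add self-loops
    d = A_tilde.sum(axis=1)                      # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    T = D_inv_sqrt @ A_tilde @ D_inv_sqrt        # symmetrically normalized transition matrix
    S = alpha * np.linalg.inv(np.eye(n) - (1 - alpha) * T)  # closed-form PPR diffusion
    return np.where(S >= eps, S, 0.0)            # sparsify small entries, as GDC does

# Toy usage on a 3-node path graph
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
S = gdc_ppr_diffusion(A)
```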
Related papers
- Matrix Completion with Graph Information: A Provable Nonconvex Optimization Approach [5.235925587710112]
We consider the problem of matrix completion with graphs as side information depicting the interrelations between variables.
We propose a graph-regularized matrix completion algorithm called GSGD, based on a preconditioned projected descent approach.
arXiv Detail & Related papers (2025-02-12T16:21:01Z)
- Efficient Graph Condensation via Gaussian Process [11.304327316816561]
Graph condensation reduces the size of large graphs while preserving performance.
Existing methods often rely on bi-level optimization, requiring extensive GNN training and limiting their scalability.
This paper proposes Graph Condensation via Gaussian Process (GCGP), a novel and computationally efficient approach to graph condensation.
arXiv Detail & Related papers (2025-01-05T14:43:07Z)
- Training-free Heterogeneous Graph Condensation via Data Selection [74.06562124781104]
We present the first Training-Free Heterogeneous Graph Condensation method, termed FreeHGC, facilitating both efficient and high-quality generation of heterogeneous condensed graphs.
Specifically, we reformulate the heterogeneous graph condensation problem as a data selection issue, offering a new perspective for assessing and condensing representative nodes and edges in heterogeneous graphs.
arXiv Detail & Related papers (2024-12-20T02:49:32Z)
- RobGC: Towards Robust Graph Condensation [61.259453496191696]
Graph neural networks (GNNs) have attracted widespread attention for their impressive capability of graph representation learning.
However, the increasing prevalence of large-scale graphs presents a significant challenge for GNN training due to their computational demands.
We propose graph condensation (GC) to generate an informative compact graph that enables efficient training of GNNs while retaining performance.
arXiv Detail & Related papers (2024-06-19T04:14:57Z)
- Graph Condensation for Open-World Graph Learning [48.38802327346445]
Graph condensation (GC) has emerged as a promising acceleration solution for efficiently training graph neural networks (GNNs).
Existing GC methods are limited to aligning the condensed graph with only the observed, static graph distribution.
In real-world scenarios, however, graphs are dynamic and constantly evolving, with new nodes and edges being continually integrated.
We propose OpenGC, a robust GC framework that integrates structure-aware distribution shift to simulate evolving graph patterns.
arXiv Detail & Related papers (2024-05-27T09:47:09Z)
- Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose a novel distance between distributions and signals on graphs.
Graph Fourier MMD (GFMMD) is defined via an optimal witness function that is both smooth on the graph and maximizes the difference in expectation.
We showcase it on graph benchmark datasets as well as on single cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z)
- Gradient scarcity with Bilevel Optimization for Graph Learning [0.0]
Gradient scarcity occurs when learning a graph by minimizing a loss on a subset of nodes: edges between unlabelled nodes that are far from the labelled ones receive zero gradients.
We give a precise mathematical characterization of this phenomenon, and prove it also emerges in bilevel optimization.
To alleviate this issue, we study several solutions: latent graph learning using a Graph-to-Graph model (G2G), graph regularization to impose a prior structure on the graph, or optimizing on a larger graph than the original one with a reduced diameter.
arXiv Detail & Related papers (2023-03-24T12:37:43Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, BGS-IMC, which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Evolving-Graph Gaussian Processes [20.065168755580558]
Existing Graph Gaussian Process (GGP) approaches have focused on static structures, whereas many real-world graphs have a dynamic structure, limiting the applications of GGPs.
We propose evolving-Graph Gaussian Processes (e-GGPs) to overcome this.
We demonstrate the benefits of e-GGPs over static graph Gaussian Process approaches.
arXiv Detail & Related papers (2021-06-29T07:16:04Z)