Graph Structural Attack by Spectral Distance
- URL: http://arxiv.org/abs/2111.00684v2
- Date: Wed, 3 Nov 2021 14:54:33 GMT
- Title: Graph Structural Attack by Spectral Distance
- Authors: Lu Lin, Ethan Blaser and Hongning Wang
- Abstract summary: Graph Convolutional Networks (GCNs) have fueled a surge of interest due to their superior performance on graph learning tasks.
In this paper, an effective graph structural attack is investigated to disrupt graph spectral filters in the Fourier domain.
- Score: 35.998704625736394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs) have fueled a surge of interest due to
their superior performance on graph learning tasks, but they have also been shown
to be vulnerable to adversarial attacks. In this paper, an effective graph
structural attack is investigated to disrupt graph spectral filters in the
Fourier domain. We define the spectral distance based on the eigenvalues of the
graph Laplacian to measure the disruption of spectral filters. We then generate
edge perturbations by simultaneously maximizing a task-specific attack
objective and the proposed spectral distance. The experiments demonstrate the
remarkable effectiveness of the proposed attack in the white-box setting at
both training and test time. Our qualitative analysis shows the connection
between the attack behavior and the imposed changes on the spectral
distribution, which provides empirical evidence that maximizing spectral
distance is an effective way to change the structural properties of graphs in
the spatial domain and perturb the frequency components in the Fourier domain.
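The abstract does not pin down the exact Laplacian variant or distance metric used; the sketch below assumes the combinatorial Laplacian L = D - A and an L2 distance between sorted spectra, which is one plausible reading of "spectral distance based on the eigenvalues of the graph Laplacian":

```python
import numpy as np

def laplacian_eigenvalues(adj):
    """Sorted eigenvalues of the (combinatorial) graph Laplacian L = D - A."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))

def spectral_distance(adj_a, adj_b):
    """L2 distance between the sorted Laplacian spectra of two graphs
    (assumed metric; the paper may use a different norm or normalization)."""
    return np.linalg.norm(laplacian_eigenvalues(adj_a) - laplacian_eigenvalues(adj_b))

# Toy example: a single edge flip on a 4-node path graph shifts the spectrum.
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
perturbed = path.copy()
perturbed[0, 3] = perturbed[3, 0] = 1.0  # add edge (0, 3)
dist = spectral_distance(path, perturbed)
```

An attacker would then select edge flips that jointly increase this distance and a task-specific attack loss, as described above.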
Related papers
- Spectral-Aware Augmentation for Enhanced Graph Representation Learning [10.36458924914831]
We present GASSER, a model that applies tailored perturbations to specific frequencies of graph structures in the spectral domain.
Through extensive experimentation and theoretical analysis, we demonstrate that the augmentation views generated by GASSER are adaptive, controllable, and intuitively aligned with the homophily ratios and spectrum of graph structures.
arXiv Detail & Related papers (2023-10-20T22:39:07Z) - HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z) - Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
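The abstract does not describe Specformer's eigenvalue encoding or attention parameterization; the following is a minimal numpy sketch of the general idea, with a hypothetical sinusoidal encoding and randomly initialized attention weights standing in for the learned components:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def eigenvalue_encoding(eigvals, dim=8):
    """Hypothetical sinusoidal encoding of each eigenvalue into a dim-vector."""
    freqs = np.arange(1, dim // 2 + 1)
    ang = np.outer(eigvals, freqs)                      # (n, dim/2)
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)  # (n, dim)

def spectral_self_attention(eigvals, dim=8, rng=np.random.default_rng(0)):
    """One self-attention layer over the *set* of encoded eigenvalues."""
    x = eigenvalue_encoding(eigvals, dim)
    wq, wk, wv = (rng.standard_normal((dim, dim)) / np.sqrt(dim) for _ in range(3))
    q, k, v = x @ wq, x @ wk, x @ wv
    attn = softmax(q @ k.T / np.sqrt(dim))              # eigenvalue-to-eigenvalue attention
    return attn @ v                                     # new spectral representations
```

Stacking several such layers (with the usual residuals and normalization) would mirror the "multiple Specformer layers" mentioned in the abstract.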
arXiv Detail & Related papers (2023-03-02T07:36:23Z) - Spectral Feature Augmentation for Graph Contrastive Learning and Beyond [64.78221638149276]
We present a novel spectral feature augmentation for contrastive learning on graphs (and images).
For each data view, we estimate a low-rank approximation per feature map and subtract that approximation from the map to obtain its complement.
This is achieved by the incomplete power iteration proposed herein, a non-standard power-iteration regime that enjoys two valuable byproducts (with merely one or two iterations).
Experiments on graph/image datasets show that our spectral feature augmentation outperforms baselines.
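The "incomplete power iteration" is described only at a high level; the sketch below assumes a rank-1 version, where a few non-converged power steps estimate the leading component of a feature map, which is then subtracted to form the complement view:

```python
import numpy as np

def spectral_feature_complement(feat, iters=1, rng=np.random.default_rng(0)):
    """Subtract a rank-1 approximation, estimated by a few ('incomplete')
    power-iteration steps, from a feature map (sketch of the described idea)."""
    v = rng.standard_normal(feat.shape[1])
    for _ in range(iters):                   # one or two steps, not run to convergence
        v = feat.T @ (feat @ v)              # power step on feat^T feat
        v /= np.linalg.norm(v) + 1e-12
    u = feat @ v                             # (unnormalized) leading left direction
    rank1 = np.outer(u, v)                   # coarse leading-component estimate
    return feat - rank1                      # complement view for contrastive learning
```

Because `rank1` equals `feat @ np.outer(v, v)` with `v` a unit vector, the complement is a projection of `feat` away from the estimated dominant spectral direction, so its energy is strictly reduced.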
arXiv Detail & Related papers (2022-12-02T08:48:11Z) - Spectral Augmentation for Self-Supervised Learning on Graphs [43.19199994575821]
Graph contrastive learning (GCL) aims to learn representations via instance discrimination.
It relies on graph augmentation to reflect invariant patterns that are robust to small perturbations.
Recent studies mainly perform topology augmentations in a uniformly random manner in the spatial domain.
We develop spectral augmentation which guides topology augmentations by maximizing the spectral change.
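The abstract leaves the spectral-change objective abstract; as an illustration only, a greedy one-step variant can score every candidate edge toggle by the resulting change in the Laplacian spectrum (L2 metric assumed) and keep the best one:

```python
import numpy as np
from itertools import combinations

def laplacian_spectrum(adj):
    """Sorted eigenvalues of the combinatorial Laplacian L = D - A."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))

def best_edge_flip(adj):
    """Greedy one-step spectral augmentation: toggle the single edge whose
    flip maximizes the change in the Laplacian spectrum (hypothetical sketch)."""
    base = laplacian_spectrum(adj)
    best, best_gain = None, -1.0
    n = adj.shape[0]
    for i, j in combinations(range(n), 2):
        cand = adj.copy()
        cand[i, j] = cand[j, i] = 1.0 - cand[i, j]   # toggle edge (i, j)
        gain = np.linalg.norm(laplacian_spectrum(cand) - base)
        if gain > best_gain:
            best, best_gain = (i, j), gain
    return best, best_gain
```

A practical method would of course use a relaxed, differentiable objective rather than this O(n^2) enumeration, but the scoring principle is the same: prefer topology edits that move the spectrum the most.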
arXiv Detail & Related papers (2022-10-02T22:20:07Z) - Point Cloud Attacks in Graph Spectral Domain: When 3D Geometry Meets Graph Signal Processing [30.86044518259855]
Point cloud learning models have been shown to be vulnerable to adversarial attacks.
We propose the graph spectral domain attack, aiming to perturb graph transform coefficients in the spectral domain that correspond to varying geometric structures.
Experimental results demonstrate the effectiveness of the proposed attack in terms of both the imperceptibility and attack success rates.
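The abstract does not specify the graph construction or which coefficients are perturbed; the sketch below assumes a k-NN graph over the points and additive noise on the high-frequency graph-Fourier coefficients of the coordinates (both `knn_adjacency` and the band choice are illustrative assumptions):

```python
import numpy as np

def knn_adjacency(points, k=2):
    """Symmetric, unweighted k-NN graph over a point cloud (assumed construction)."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    adj = np.zeros_like(d)
    for i in range(len(points)):
        adj[i, np.argsort(d[i])[:k]] = 1.0
    return np.maximum(adj, adj.T)

def gft_attack(points, k=2, eps=0.05):
    """Perturb the graph-Fourier coefficients of the coordinates and transform
    back (hypothetical sketch of a spectral-domain point cloud attack)."""
    adj = knn_adjacency(points, k)
    lap = np.diag(adj.sum(axis=1)) - adj
    _, u = np.linalg.eigh(lap)            # GFT basis: Laplacian eigenvectors
    coeffs = u.T @ points                 # forward graph Fourier transform
    coeffs[len(points) // 2:] += eps      # nudge the high-frequency band
    return u @ coeffs                     # inverse transform -> attacked cloud
```

Targeting high-frequency coefficients is what makes such attacks hard to see: they alter fine geometric detail while leaving the smooth, low-frequency shape largely intact.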
arXiv Detail & Related papers (2022-07-27T07:02:36Z) - Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z) - An Experimental Study of the Transferability of Spectral Graph Networks [5.736353542430439]
Spectral graph convolutional networks are generalizations of standard convolutional networks for graph-structured data using the Laplacian operator.
Recent works have proved the stability of spectral filters under graph perturbations.
arXiv Detail & Related papers (2020-12-18T14:15:07Z) - Bridging the Gap Between Spectral and Spatial Domains in Graph Neural Networks [8.563354084119062]
We show the equivalence of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.
The proposed framework is used to design new convolutions in the spectral domain with a custom frequency profile while applying them in the spatial domain.
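The spectral-to-spatial correspondence rests on the standard identity y = U g(Λ) Uᵀ x: a filter defined by a frequency profile g(·) on the Laplacian eigenvalues acts as a (dense) spatial operator. A minimal numpy sketch, not the paper's specific construction:

```python
import numpy as np

def spectral_filter_in_spatial_domain(adj, signal, freq_profile):
    """Apply a graph filter with a custom frequency profile g(.):
    y = U g(Lambda) U^T x, realized as a spatial-domain matrix."""
    lap = np.diag(adj.sum(axis=1)) - adj          # combinatorial Laplacian
    eigvals, eigvecs = np.linalg.eigh(lap)
    spatial_op = eigvecs @ np.diag(freq_profile(eigvals)) @ eigvecs.T
    return spatial_op @ signal

# Example profile: low-pass, attenuating high graph frequencies.
lowpass = lambda lam: np.exp(-lam)
```

With the all-ones profile g(λ) = 1 the operator reduces to the identity, which is a quick sanity check that the spectral and spatial views coincide.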
arXiv Detail & Related papers (2020-03-26T01:49:24Z) - Spectral Graph Attention Network with Fast Eigen-approximation [103.93113062682633]
Spectral Graph Attention Network (SpGAT) learns representations for different frequency components regarding weighted filters and graph wavelets bases.
Fast approximation variant SpGAT-Cheby is proposed to reduce the computational cost brought by the eigen-decomposition.
We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby in semi-supervised node classification tasks.
arXiv Detail & Related papers (2020-03-16T21:49:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.