Spectral Feature Augmentation for Graph Contrastive Learning and Beyond
- URL: http://arxiv.org/abs/2212.01026v1
- Date: Fri, 2 Dec 2022 08:48:11 GMT
- Title: Spectral Feature Augmentation for Graph Contrastive Learning and Beyond
- Authors: Yifei Zhang, Hao Zhu, Zixing Song, Piotr Koniusz, Irwin King
- Abstract summary: We present a novel spectral feature augmentation for contrastive learning on graphs (and images).
For each data view, we estimate a low-rank approximation per feature map and subtract that approximation from the map to obtain its complement.
This is achieved by our proposed incomplete power iteration, a non-standard power iteration regime that yields two valuable byproducts using merely one or two iterations.
Experiments on graph/image datasets show that our spectral feature augmentation outperforms baselines.
- Score: 64.78221638149276
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although augmentations (e.g., perturbation of graph edges, image crops) boost
the efficiency of Contrastive Learning (CL), feature-level augmentation is
another plausible, complementary, yet under-researched strategy. Thus, we
present a novel spectral feature augmentation for contrastive learning on
graphs (and images). To this end, for each data view, we estimate a low-rank
approximation per feature map and subtract that approximation from the map to
obtain its complement. This is achieved by our proposed incomplete power
iteration, a non-standard power iteration regime which enjoys two valuable
byproducts (using merely one or two iterations): (i) it partially balances the
spectrum of the feature map, and (ii) it injects noise into the rebalanced
singular values of the feature map (spectral augmentation). For two views, we
align these rebalanced feature maps, as this improved alignment step can
focus more on the less dominant singular values of both views' matrices, whereas
the spectral augmentation does not affect the spectral angle alignment
(singular vectors are not perturbed). We derive the analytical form for: (i)
the incomplete power iteration to capture its spectrum-balancing effect, and
(ii) the variance of singular values augmented implicitly by the noise. We also
show that the spectral augmentation improves the generalization bound.
Experiments on graph/image datasets show that our spectral feature augmentation
outperforms baselines, and is complementary with other augmentation strategies
and compatible with various contrastive losses.
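A minimal NumPy sketch of the core idea described above (the function name, the rank-1 simplification, and all parameters are assumptions for illustration; the paper's actual procedure may differ): estimate a low-rank approximation of a feature map with a deliberately truncated power iteration, then subtract it to obtain the complement.

```python
import numpy as np

def spectral_feature_augment(X, n_iter=1, rng=None):
    """Subtract an incompletely estimated rank-1 approximation from the
    feature map X (shape: n_samples x n_features).

    Running the power iteration for only one or two steps leaves v only
    partially aligned with the top right-singular vector of X, which is
    the source of the implicit spectral noise described in the abstract.
    """
    rng = np.random.default_rng() if rng is None else rng
    v = rng.standard_normal(X.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):          # deliberately too few steps to converge
        v = X.T @ (X @ v)
        v /= np.linalg.norm(v)
    low_rank = np.outer(X @ v, v)    # rank-1 approximation along v
    return X - low_rank              # complement of the approximation
```

Since X - (Xv)v^T = X(I - vv^T) projects out the direction v, the result has a Frobenius norm no larger than X's and its leading singular value cannot grow, which is consistent with the spectrum-balancing effect the abstract describes.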
Related papers
- Variance-Reducing Couplings for Random Features [57.73648780299374]
Random features (RFs) are a popular technique to scale up kernel methods in machine learning.
We find couplings to improve RFs defined on both Euclidean and discrete input spaces.
We reach surprising conclusions about the benefits and limitations of variance reduction as a paradigm.
  (arXiv 2024-05-26T12:25:09Z)
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
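As a rough illustration of that reconstruction step (a sketch under the assumption of a combinatorial Laplacian L = D - A; this is not GRASP's actual code, and the function name is hypothetical), the adjacency matrix can be recovered from sampled eigenpairs as follows:

```python
import numpy as np

def reconstruct_graph(eigvals, eigvecs):
    """Rebuild L = U diag(w) U^T from (possibly denoised/sampled)
    eigenpairs, then recover A via L = D - A: the degrees sit on the
    diagonal of L and the off-diagonal entries of L are -A."""
    L = eigvecs @ np.diag(eigvals) @ eigvecs.T
    A = np.diag(np.diag(L)) - L      # zero diagonal, negated off-diagonal
    return L, A
```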
  (arXiv 2024-02-29T09:26:46Z)
- Through the Dual-Prism: A Spectral Perspective on Graph Data Augmentation for Graph Classification [71.36575018271405]
We introduce the Dual-Prism (DP) augmentation method, comprising DP-Noise and DP-Mask.
We find that keeping the low-frequency eigenvalues unchanged can preserve the critical properties at a large scale when generating augmented graphs.
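A hedged sketch of that principle (the function name and Gaussian noise model are assumptions, not the DP-Noise implementation): perturb only the high-frequency Laplacian eigenvalues and reconstruct, leaving the lowest ones untouched.

```python
import numpy as np

def low_freq_preserving_noise(A, keep=2, sigma=0.1, rng=None):
    """Perturb the Laplacian spectrum while leaving the `keep` smallest
    (low-frequency) eigenvalues unchanged."""
    rng = np.random.default_rng() if rng is None else rng
    L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian
    w, U = np.linalg.eigh(L)                # eigenvalues in ascending order
    w_aug = w.copy()
    w_aug[keep:] += sigma * rng.standard_normal(len(w) - keep)
    return U @ np.diag(w_aug) @ U.T         # augmented Laplacian
```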
arXiv Detail & Related papers (2024-01-18T12:58:53Z) - Hodge-Aware Contrastive Learning [101.56637264703058]
Simplicial complexes prove effective in modeling data with multiway dependencies.
We develop a contrastive self-supervised learning approach for processing simplicial data.
arXiv Detail & Related papers (2023-09-14T00:40:07Z) - Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z) - Spectral Augmentation for Self-Supervised Learning on Graphs [43.19199994575821]
Graph contrastive learning (GCL) aims to learn representations via instance discrimination.
It relies on graph augmentation to reflect invariant patterns that are robust to small perturbations.
Recent studies mainly perform topology augmentations in a uniformly random manner in the spatial domain.
We develop spectral augmentation which guides topology augmentations by maximizing the spectral change.
arXiv Detail & Related papers (2022-10-02T22:20:07Z) - Robust, Nonparametric, Efficient Decomposition of Spectral Peaks under
Distortion and Interference [0.0]
We propose a decomposition method for the spectral peaks in an observed frequency spectrum, which is efficiently acquired by utilizing the Fast Fourier Transform.
We model the peaks in the spectrum as pseudo-symmetric functions, whose only constraint is a non-increasing behavior as the distance from a central frequency increases.
Our approach is more robust against arbitrary distortion, interference and noise on the spectrum that may be caused by an observation system.
arXiv Detail & Related papers (2022-04-18T17:08:37Z) - Graph Structural Attack by Spectral Distance [35.998704625736394]
Graph Convolutional Networks (GCNs) have fueled a surge of interest due to their superior performance on graph learning tasks.
In this paper, an effective graph structural attack is investigated to disrupt graph spectral filters in the Fourier domain.
arXiv Detail & Related papers (2021-11-01T04:02:34Z) - Deeply Learned Spectral Total Variation Decomposition [8.679020335206753]
We present a neural network approximation of a non-linear spectral decomposition.
We report up to four orders of magnitude ($\times 10,000$) speedup in processing of mega-pixel size images.
arXiv Detail & Related papers (2020-06-17T17:10:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.