Spectral Augmentation for Self-Supervised Learning on Graphs
- URL: http://arxiv.org/abs/2210.00643v2
- Date: Tue, 20 Jun 2023 18:24:52 GMT
- Title: Spectral Augmentation for Self-Supervised Learning on Graphs
- Authors: Lu Lin, Jinghui Chen, Hongning Wang
- Abstract summary: Graph contrastive learning (GCL) aims to learn representations via instance discrimination.
It relies on graph augmentation to reflect invariant patterns that are robust to small perturbations.
Recent studies mainly perform topology augmentations in a uniformly random manner in the spatial domain.
We develop spectral augmentation which guides topology augmentations by maximizing the spectral change.
- Score: 43.19199994575821
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph contrastive learning (GCL), as an emerging self-supervised learning
technique on graphs, aims to learn representations via instance discrimination.
Its performance heavily relies on graph augmentation to reflect invariant
patterns that are robust to small perturbations; yet it remains unclear what
graph invariance GCL should capture. Recent studies mainly perform
topology augmentations in a uniformly random manner in the spatial domain,
ignoring their influence on the intrinsic structural properties embedded in the
spectral domain. In this work, we aim to find a principled way for topology
augmentations by exploring the invariance of graphs from the spectral
perspective. We develop spectral augmentation which guides topology
augmentations by maximizing the spectral change. Extensive experiments on both
graph and node classification tasks demonstrate the effectiveness of our method
in self-supervised representation learning. The proposed method also brings
promising generalization capability in transfer learning, and exhibits an
intriguing robustness property under adversarial attacks. Our study sheds light
on a general principle for graph topology augmentation.
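The abstract's core idea, choosing topology perturbations that maximize the change in the graph's Laplacian spectrum, can be illustrated with a small hypothetical sketch. This is not the authors' implementation: the brute-force single-edge search and the L2 distance between spectra are simplifying assumptions made purely for illustration.

```python
import numpy as np

def laplacian_spectrum(A):
    """Ascending eigenvalues of the symmetric normalized Laplacian
    L = I - D^{-1/2} A D^{-1/2}, with degree-0 (isolated) nodes handled."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return np.linalg.eigvalsh(L)

def best_edge_flip(A):
    """Try every single edge addition/removal and return the flip that
    maximizes the L2 distance between original and perturbed spectra."""
    base = laplacian_spectrum(A)
    best_flip, best_change = None, -1.0
    n = len(A)
    for i in range(n):
        for j in range(i + 1, n):
            B = A.copy()
            B[i, j] = B[j, i] = 1.0 - B[i, j]  # flip edge (i, j)
            change = float(np.linalg.norm(laplacian_spectrum(B) - base))
            if change > best_change:
                best_flip, best_change = (i, j), change
    return best_flip, best_change

# Tiny example: a 4-node path graph 0-1-2-3.
A = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 3)]:
    A[u, v] = A[v, u] = 1.0
flip, change = best_edge_flip(A)
```

In practice the paper operates on much larger graphs, where an exhaustive flip search is infeasible; this sketch only makes the objective (spectral change of a topology perturbation) concrete.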
Related papers
- Heterogeneous Graph Contrastive Learning with Spectral Augmentation [15.231689595121553]
This paper introduces SHCL, the first spectral-enhanced graph contrastive learning model for heterogeneous graph neural networks.
The proposed model learns an adaptive topology augmentation scheme through the heterogeneous graph itself.
Experimental results on multiple real-world datasets demonstrate substantial advantages of the proposed model.
arXiv Detail & Related papers (2024-06-30T14:20:12Z)
- Gradformer: Graph Transformer with Exponential Decay [69.50738015412189]
The self-attention mechanism in Graph Transformers (GTs) overlooks the graph's inductive biases, particularly biases related to structure.
This paper presents Gradformer, a method innovatively integrating GT with the intrinsic inductive bias.
Gradformer consistently outperforms Graph Neural Network (GNN) and GT baselines on various graph classification and regression tasks.
arXiv Detail & Related papers (2024-04-24T08:37:13Z)
- Through the Dual-Prism: A Spectral Perspective on Graph Data Augmentation for Graph Classification [71.36575018271405]
We introduce the Dual-Prism (DP) augmentation method, comprising DP-Noise and DP-Mask.
We find that keeping the low-frequency eigenvalues unchanged can preserve the critical properties at a large scale when generating augmented graphs.
arXiv Detail & Related papers (2024-01-18T12:58:53Z)
- Augment with Care: Enhancing Graph Contrastive Learning with Selective Spectrum Perturbation [11.322569167679633]
Graph Contrastive Learning (GCL) has shown remarkable effectiveness in learning representations on graphs.
Existing augmentation views with perturbed graph structures are usually based on random topology corruption in the spatial domain.
We propose GASSER, which applies tailored perturbations to specific frequencies of the graph structure in the spectral domain.
arXiv Detail & Related papers (2023-10-20T22:39:07Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Let Invariant Rationale Discovery Inspire Graph Contrastive Learning [98.10268114789775]
We argue that a high-performing augmentation should preserve the salient semantics of anchor graphs regarding instance-discrimination.
We propose a new framework, Rationale-aware Graph Contrastive Learning (RGCL).
RGCL uses a rationale generator to reveal salient features about graph instance-discrimination as the rationale, and then creates rationale-aware views for contrastive learning.
arXiv Detail & Related papers (2022-06-16T01:28:40Z)
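Several of the papers above reason in the spectral domain; the Dual-Prism entry, for example, argues that keeping low-frequency Laplacian eigenvalues unchanged preserves a graph's critical large-scale properties. A minimal hypothetical sketch of that idea follows. The function name, the `keep` and `noise_scale` parameters, and the use of the unnormalized Laplacian are all assumptions for illustration, not any of these papers' actual implementations.

```python
import numpy as np

def low_freq_preserving_aug(A, keep=2, noise_scale=0.1, seed=0):
    """Perturb only the high-frequency part of the Laplacian spectrum:
    the `keep` smallest eigenvalues (low frequencies) stay fixed, the rest
    get Gaussian noise, and a weighted graph is rebuilt from the result."""
    rng = np.random.default_rng(seed)
    L = np.diag(A.sum(axis=1)) - A           # unnormalized Laplacian D - A
    vals, vecs = np.linalg.eigh(L)           # eigenvalues in ascending order
    noisy = vals.copy()
    noisy[keep:] += noise_scale * rng.standard_normal(len(vals) - keep)
    L_aug = vecs @ np.diag(noisy) @ vecs.T   # reassemble perturbed Laplacian
    A_aug = np.diag(np.diag(L_aug)) - L_aug  # recover a weighted adjacency
    return A_aug, vals, noisy

# Tiny example: a 4-node path graph 0-1-2-3.
A = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 3)]:
    A[u, v] = A[v, u] = 1.0
A_aug, vals, noisy = low_freq_preserving_aug(A)
```

Note that the reconstructed `A_aug` is generally a dense weighted matrix; turning it back into a discrete graph (e.g. by thresholding) is a separate design choice these papers address in different ways.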
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.