Dual Space Graph Contrastive Learning
- URL: http://arxiv.org/abs/2201.07409v1
- Date: Wed, 19 Jan 2022 04:10:29 GMT
- Title: Dual Space Graph Contrastive Learning
- Authors: Haoran Yang, Hongxu Chen, Shirui Pan, Lin Li, Philip S. Yu, Guandong
Xu
- Abstract summary: We propose a novel graph contrastive learning method, namely Dual Space Graph Contrastive (DSGC) Learning.
Since both spaces have their own advantages to represent graph data in the embedding spaces, we hope to utilize graph contrastive learning to bridge the spaces and leverage advantages from both sides.
- Score: 82.81372024482202
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised graph representation learning has emerged as a powerful tool
for addressing real-world problems and has achieved huge success in the graph learning
domain. Graph contrastive learning is one such unsupervised method; it has recently
attracted attention from researchers and achieved state-of-the-art performance on
various tasks. The key to the success of graph contrastive learning is to construct
proper contrasting pairs that capture the underlying structural semantics of the graph.
However, this key step is not yet fully explored: most approaches to generating
contrasting pairs focus on augmenting or perturbing graph structures to obtain
different views of the input graph. Such strategies can degrade performance by adding
noise to the graph, which may narrow the range of applications of graph contrastive
learning. In this paper, we propose a novel graph contrastive learning method, namely
\textbf{D}ual \textbf{S}pace \textbf{G}raph \textbf{C}ontrastive (DSGC) Learning,
which conducts graph contrastive learning between views generated in different spaces,
namely the hyperbolic space and the Euclidean space. Since each space has its own
advantages for representing graph data in the embedding space, we use graph
contrastive learning to bridge the two spaces and leverage the advantages of both.
Comparison experiments show that DSGC achieves competitive or better performance
across all datasets. In addition, we conduct extensive experiments to analyze the
impact of different graph encoders on DSGC, giving insights into how to better
leverage the advantages of contrastive learning between different spaces.
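The abstract specifies only that DSGC contrasts views drawn from the Euclidean and the
hyperbolic embedding spaces; the sketch below is an illustrative, non-authoritative
rendering of that idea in PyTorch, combining a Poincaré-ball exponential map with an
InfoNCE-style objective. The function names, the fixed curvature of -1, and the
temperature `tau` are assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F


def exp_map_zero(v, eps=1e-6):
    # Exponential map at the origin of the unit Poincare ball (curvature -1):
    # lifts a Euclidean (tangent-space) vector into hyperbolic space.
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(norm) * v / norm


def poincare_distance(x, y, eps=1e-6):
    # Geodesic distance between points on the unit Poincare ball.
    diff2 = (x - y).pow(2).sum(-1)
    denom = (1 - x.pow(2).sum(-1)).clamp_min(eps) * (1 - y.pow(2).sum(-1)).clamp_min(eps)
    return torch.acosh((1 + 2 * diff2 / denom).clamp_min(1 + eps))


def dual_space_infonce(z_euc, z_hyp, tau=0.5):
    # Illustrative InfoNCE-style objective: the Euclidean view of graph i,
    # lifted onto the ball, should be closer (in hyperbolic distance) to its
    # own hyperbolic view than to the hyperbolic views of other graphs.
    z_lift = exp_map_zero(z_euc)                                        # [B, d]
    dist = poincare_distance(z_lift.unsqueeze(1), z_hyp.unsqueeze(0))   # [B, B]
    logits = -dist / tau                                                # closer -> larger logit
    labels = torch.arange(z_euc.size(0), device=z_euc.device)
    return F.cross_entropy(logits, labels)


# Toy usage: in DSGC the two views would come from graph encoders operating in
# each space; random tensors stand in for those encoder outputs here.
z_euc = torch.randn(8, 16)                  # Euclidean-space graph embeddings
z_hyp = exp_map_zero(torch.randn(8, 16))    # stand-in hyperbolic-space embeddings
loss = dual_space_infonce(z_euc, z_hyp)
```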
Related papers
- Subgraph Networks Based Contrastive Learning [5.736011243152416]
Graph contrastive learning (GCL) can solve the problem of annotated data scarcity.
Most existing GCL methods focus on the design of graph augmentation strategies and mutual information estimation operations.
We propose a novel framework called subgraph network-based contrastive learning (SGNCL)
arXiv Detail & Related papers (2023-06-06T08:52:44Z) - Capturing Fine-grained Semantics in Contrastive Graph Representation
Learning [23.861016307326146]
Graph contrastive learning defines a contrastive task to pull similar instances close and push dissimilar instances away.
Existing methods of graph contrastive learning ignore the differences between the diverse semantics that exist in graphs.
We propose a novel Fine-grained Semantics enhanced Graph Contrastive Learning (FSGCL) in this paper.
arXiv Detail & Related papers (2023-04-23T14:05:05Z) - Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z) - ARIEL: Adversarial Graph Contrastive Learning [51.14695794459399]
ARIEL consistently outperforms the current graph contrastive learning methods for both node-level and graph-level classification tasks.
ARIEL is more robust in the face of adversarial attacks.
arXiv Detail & Related papers (2022-08-15T01:24:42Z) - GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z) - Adversarial Graph Contrastive Learning with Information Regularization [51.14695794459399]
Contrastive learning is an effective method in graph representation learning.
Data augmentation on graphs is far less intuitive, and it is much harder to provide high-quality contrastive samples.
We propose a simple but effective method, Adversarial Graph Contrastive Learning (ARIEL)
It consistently outperforms the current graph contrastive learning methods in the node classification task over various real-world datasets.
arXiv Detail & Related papers (2022-02-14T05:54:48Z) - Graph Self-supervised Learning with Accurate Discrepancy Learning [64.69095775258164]
We propose a framework that aims to learn the exact discrepancy between the original and the perturbed graphs, coined as Discrepancy-based Self-supervised LeArning (D-SLA)
We validate our method on various graph-related downstream tasks, including molecular property prediction, protein function prediction, and link prediction tasks, on which our model largely outperforms relevant baselines.
arXiv Detail & Related papers (2022-02-07T08:04:59Z) - Multi-Level Graph Contrastive Learning [38.022118893733804]
We propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representation of graph data by contrasting space views of graphs.
The original graph is a first-order approximation structure and contains uncertainty or error, while the $k$NN graph generated from encoded features preserves high-order proximity.
Extensive experiments indicate MLGCL achieves promising results compared with the existing state-of-the-art graph representation learning methods on seven datasets.
arXiv Detail & Related papers (2021-07-06T14:24:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.