CoCo: A Coupled Contrastive Framework for Unsupervised Domain Adaptive
Graph Classification
- URL: http://arxiv.org/abs/2306.04979v2
- Date: Sat, 10 Jun 2023 11:20:26 GMT
- Authors: Nan Yin, Li Shen, Mengzhu Wang, Long Lan, Zeyu Ma, Chong Chen,
Xian-Sheng Hua, Xiao Luo
- Abstract summary: We propose Coupled Contrastive Graph Representation Learning (CoCo), which extracts the topological information from coupled learning branches.
CoCo generally outperforms competing baselines across different settings.
- Score: 32.479834854094214
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Although graph neural networks (GNNs) have achieved impressive
results in graph classification, they often need abundant task-specific labels,
which can be extremely costly to acquire. A credible solution is to explore
additional labeled graphs to enhance unsupervised learning on the target
domain. However, how to apply GNNs to domain adaptation remains unsolved owing
to the insufficient exploration of graph topology and the significant domain
discrepancy. In this paper, we propose Coupled Contrastive Graph Representation
Learning (CoCo), which extracts the topological information from coupled
learning branches and reduces the domain discrepancy with coupled contrastive
learning. CoCo contains a graph convolutional network branch and a hierarchical
graph kernel network branch, which explore graph topology in implicit and
explicit manners. Besides, we incorporate coupled branches into a holistic
multi-view contrastive learning framework, which not only incorporates graph
representations learned from complementary views for enhanced understanding,
but also encourages the similarity between cross-domain example pairs with the
same semantics for domain alignment. Extensive experiments on popular datasets
show that CoCo generally outperforms competing baselines across different
settings.
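The cross-view contrastive idea in the abstract can be sketched as an InfoNCE-style objective between embeddings produced by the two branches, where each graph's representation in one branch is pulled toward its own representation in the other branch. This is a minimal illustrative sketch, not the paper's exact loss; the names `z_gcn`, `z_kernel`, and the temperature `tau` are assumptions.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    # InfoNCE-style contrastive loss between two views of the same graphs.
    # Rows of z1 and z2 are embeddings of the same batch of graphs,
    # e.g. from a GCN branch and a graph-kernel branch (hypothetical names).
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau  # pairwise cosine similarities, temperature-scaled
    # Positives sit on the diagonal: the same graph seen by both branches.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z_gcn = rng.normal(size=(8, 16))     # embeddings from one branch (assumed)
z_kernel = rng.normal(size=(8, 16))  # embeddings from the other branch
loss = info_nce(z_gcn, z_kernel)
```

Minimizing such a loss pushes same-graph pairs together across views; when the two views agree perfectly, the loss stays below the uniform baseline `log(batch_size)`.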
Related papers
- DGNN: Decoupled Graph Neural Networks with Structural Consistency
between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - Domain Adaptive Graph Classification [0.0]
We introduce Dual Adversarial Graph Representation Learning (DAGRL), which explores the graph topology from dual branches and mitigates domain discrepancies via dual adversarial learning.
Our approach incorporates adaptive perturbations into the dual branches, which align the source and target distribution to address domain discrepancies.
arXiv Detail & Related papers (2023-12-21T02:37:56Z) - TGNN: A Joint Semi-supervised Framework for Graph-level Classification [34.300070497510276]
We propose a novel semi-supervised framework called Twin Graph Neural Network (TGNN).
To explore graph structural information from complementary views, our TGNN has a message passing module and a graph kernel module.
We evaluate our TGNN on various public datasets and show that it achieves strong performance.
arXiv Detail & Related papers (2023-04-23T15:42:11Z) - Semantic Graph Neural Network with Multi-measure Learning for
Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z) - Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model by contrasting cluster assignments, called GRCCA.
It combines clustering algorithms and contrastive learning to jointly exploit local and global information.
GRCCA has strong competitiveness in most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z) - Bridging the Gap between Spatial and Spectral Domains: A Unified
Framework for Graph Neural Networks [61.17075071853949]
Graph neural networks (GNNs) are designed to deal with graph-structured data that classical deep learning does not easily manage.
The purpose of this study is to establish a unified framework that integrates GNNs based on spectral graph and approximation theory.
arXiv Detail & Related papers (2021-07-21T17:34:33Z) - Multi-Level Graph Contrastive Learning [38.022118893733804]
We propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representation of graph data by contrasting space views of graphs.
The original graph is a first-order approximation structure and contains uncertainty or error, while the $k$NN graph generated from encoded features preserves high-order proximity.
Extensive experiments indicate MLGCL achieves promising results compared with the existing state-of-the-art graph representation learning methods on seven datasets.
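The feature-space view that the MLGCL summary describes can be sketched as a $k$NN graph built from node embeddings, to be contrasted with the original adjacency. The helper below is a hypothetical illustration under assumed choices (cosine similarity, hard top-$k$ selection); it is not the paper's exact construction.

```python
import numpy as np

def knn_graph(x, k=2):
    # Build a kNN adjacency matrix from node features using cosine similarity.
    # This gives the "feature-space view" contrasted with the original graph.
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)
    sim = xn @ xn.T
    np.fill_diagonal(sim, -np.inf)  # exclude self-loops from the neighbor search
    adj = np.zeros((len(x), len(x)))
    idx = np.argsort(-sim, axis=1)[:, :k]  # indices of the k most similar nodes
    rows = np.repeat(np.arange(len(x)), k)
    adj[rows, idx.ravel()] = 1.0
    return adj

# Two tight feature clusters: nodes 0,1 and nodes 2,3
x = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
A = knn_graph(x, k=1)
```

With `k=1`, each node links to its single most similar neighbor in feature space, so the two clusters stay separate regardless of what the original edge set looked like.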
arXiv Detail & Related papers (2021-07-06T14:24:43Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Graph Representation Learning via Graphical Mutual Information
Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.