Contrastive Disentangled Learning on Graph for Node Classification
- URL: http://arxiv.org/abs/2306.11344v1
- Date: Tue, 20 Jun 2023 07:25:14 GMT
- Title: Contrastive Disentangled Learning on Graph for Node Classification
- Authors: Xiaojuan Zhang and Jun Fu and Shuang Li
- Abstract summary: We propose a novel framework for contrastive disentangled learning on graphs, employing a disentangled graph encoder and two carefully crafted self-supervision signals.
Specifically, we introduce a disentangled graph encoder that forces the framework to distinguish the various latent factors corresponding to the underlying semantic information.
To overcome the heavy reliance on labels, we design two self-supervision signals, namely node specificity and channel independence, which capture informative knowledge without the need for labeled data.
- Score: 11.678287036601564
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Contrastive learning methods have attracted considerable attention due to
their remarkable success in analyzing graph-structured data. Inspired by the
success of contrastive learning, we propose a novel framework for contrastive
disentangled learning on graphs, employing a disentangled graph encoder and two
carefully crafted self-supervision signals. Specifically, we introduce a
disentangled graph encoder that forces the framework to distinguish the various
latent factors corresponding to underlying semantic information and learn the
disentangled node embeddings. Moreover, to overcome the heavy reliance on
labels, we design two self-supervision signals, namely node specificity and
channel independence, which capture informative knowledge without the need for
labeled data, thereby guiding the automatic disentanglement of nodes. Finally,
we perform node classification on three citation networks using the
disentangled node embeddings and provide the relevant analysis.
Experimental results validate the effectiveness of the proposed framework
compared with various baselines.
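The abstract names the channel-independence signal but does not spell out its mathematical form. As a rough illustration only, a decorrelation-style penalty over per-channel node embeddings (shape: nodes × channels × dim) might look like the sketch below; the function name and exact form are assumptions, not the paper's actual loss.

```python
import numpy as np

def channel_independence_penalty(z: np.ndarray) -> float:
    """Illustrative channel-independence penalty for disentangled node
    embeddings z of shape (num_nodes, num_channels, dim).

    Penalizes cosine similarity between every pair of a node's channel
    embeddings; 0 means the channels are mutually orthogonal. (The
    paper's exact loss is not given in the abstract; this is a common
    decorrelation-style stand-in.)
    """
    # L2-normalize each channel embedding per node
    z_norm = z / (np.linalg.norm(z, axis=-1, keepdims=True) + 1e-8)
    # Pairwise channel cosine similarities per node: (N, K, K)
    sim = np.einsum("nkd,nld->nkl", z_norm, z_norm)
    k = z.shape[1]
    off_diag = ~np.eye(k, dtype=bool)  # select cross-channel pairs only
    return float(np.mean(np.abs(sim[:, off_diag])))

# Perfectly disentangled toy embeddings: each channel uses a disjoint axis.
z = np.zeros((4, 3, 6))
for c in range(3):
    z[:, c, 2 * c] = 1.0
print(channel_independence_penalty(z))  # prints 0.0 for orthogonal channels
```

A real training objective would add such a term to the contrastive loss so that gradient descent pushes the channels apart.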
Related papers
- CONVERT: Contrastive Graph Clustering with Reliable Augmentation [110.46658439733106]
We propose a novel CONtrastiVe Graph ClustEring network with Reliable AugmenTation (CONVERT).
In our method, the data augmentations are processed by the proposed reversible perturb-recover network.
To further guarantee the reliability of semantics, a novel semantic loss is presented to constrain the network.
arXiv Detail & Related papers (2023-08-17T13:07:09Z)
- Coarse-to-Fine Contrastive Learning on Graphs [38.41992365090377]
A variety of graph augmentation strategies have been employed to learn node representations in a self-supervised manner.
We introduce a self-ranking paradigm to ensure that the discriminative information among different nodes can be maintained.
Experiment results on various benchmark datasets verify the effectiveness of our algorithm.
arXiv Detail & Related papers (2022-12-13T08:17:20Z)
- Supervised Contrastive Learning with Structure Inference for Graph Classification [5.276232626689567]
We propose a graph neural network based on supervised contrastive learning and structure inference for graph classification.
With the integration of label information, the one-vs-many contrastive learning can be extended to a many-vs-many setting.
Experiment results show the effectiveness of the proposed method compared with recent state-of-the-art methods.
arXiv Detail & Related papers (2022-03-15T07:18:46Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space, in contrast to existing techniques that embed each node as a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
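The distributional-embedding idea above can be sketched concretely: instead of a single vector, each node gets a Gaussian in the latent space, from which samples are drawn via the reparameterization trick. The parameterization below is illustrative only; the paper's actual model is not specified in this summary.

```python
import numpy as np

def sample_node_embedding(mu: np.ndarray, log_var: np.ndarray,
                          rng: np.random.Generator) -> np.ndarray:
    """A node is represented by N(mu, diag(exp(log_var))) rather than a
    deterministic vector; downstream contrastive terms can then operate
    on samples drawn via the reparameterization trick z = mu + sigma*eps.
    (Illustrative sketch -- not the paper's exact parameterization.)
    """
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

rng = np.random.default_rng(0)
mu = np.zeros(8)            # mean embedding for one node
log_var = np.full(8, -2.0)  # log-variance, i.e. sigma = exp(-1)
z = sample_node_embedding(mu, log_var, rng)
```

Because `z` is stochastic, repeated samples of the same node differ, which is what lets such methods quantify embedding uncertainty.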
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Noise-robust Graph Learning by Estimating and Leveraging Pairwise Interactions [123.07967420310796]
This paper bridges the gap by proposing PI-GNN, a pairwise-interaction (PI) framework for noisy node classification on graphs.
PI-GNN relies on the PI as a primary learning proxy in addition to pointwise learning from the noisy node class labels.
Our proposed framework PI-GNN contributes two novel components: (1) a confidence-aware PI estimation model that adaptively estimates the PI labels, and (2) a decoupled training approach that leverages the estimated PI labels.
arXiv Detail & Related papers (2021-06-14T14:23:08Z)
- Self-Supervised Graph Learning with Proximity-based Views and Channel Contrast [4.761137180081091]
Graph neural networks (GNNs) use neighborhood aggregation as a core component that results in feature smoothing among nodes in proximity.
To tackle this problem, we strengthen the graph with two additional graph views, in which nodes are directly linked to those with the most similar features or local structures.
We propose a method that aims to maximize the agreement between representations across generated views and the original graph.
arXiv Detail & Related papers (2021-06-07T15:38:36Z)
- Information Obfuscation of Graph Neural Networks [96.8421624921384]
We study the problem of protecting sensitive attributes by information obfuscation when learning with graph structured data.
We propose a framework to locally filter out pre-determined sensitive attributes via adversarial training with the total variation and the Wasserstein distance.
arXiv Detail & Related papers (2020-09-28T17:55:04Z)
- Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z)
- GraphCL: Contrastive Self-Supervised Learning of Graph Representations [20.439666392958284]
We propose Graph Contrastive Learning (GraphCL), a general framework for learning node representations in a self-supervised manner.
We use graph neural networks to produce two representations of the same node and leverage a contrastive learning loss to maximize agreement between them.
In both transductive and inductive learning setups, we demonstrate that our approach significantly outperforms the state-of-the-art in unsupervised learning on a number of node classification benchmarks.
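The "maximize agreement between two views of the same node" objective described above is typically an InfoNCE/NT-Xent-style loss. GraphCL's exact formulation may differ from this sketch, which shows the standard form such methods use, with a positive pair being the same node in the two views.

```python
import numpy as np

def nt_xent(z1: np.ndarray, z2: np.ndarray, tau: float = 0.5) -> float:
    """Sketch of an NT-Xent-style agreement loss between two views'
    node embeddings z1, z2 of shape (num_nodes, dim). Lower loss means
    the same node's two views agree more than cross-node pairs do.
    (Illustrative; not necessarily GraphCL's exact objective.)
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # cross-view similarities
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal (same node, two views).
    return float(-np.mean(np.diag(log_prob)))

z1 = np.eye(4)                      # toy view-1 embeddings
loss_aligned = nt_xent(z1, z1)      # positives agree perfectly
loss_shuffled = nt_xent(z1, np.roll(z1, 1, axis=0))  # positives mismatched
```

Minimizing this loss pulls a node's two view embeddings together while pushing apart embeddings of different nodes.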
arXiv Detail & Related papers (2020-07-15T22:36:53Z)
- Deep Graph Contrastive Representation Learning [23.37786673825192]
We propose a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level.
Specifically, we generate two graph views by corruption and learn node representations by maximizing the agreement of node representations in these two views.
We perform empirical experiments on both transductive and inductive learning tasks using a variety of real-world datasets.
arXiv Detail & Related papers (2020-06-07T11:50:45Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning (GIL) framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.