Simple Contrastive Graph Clustering
- URL: http://arxiv.org/abs/2205.07865v1
- Date: Wed, 11 May 2022 06:45:19 GMT
- Title: Simple Contrastive Graph Clustering
- Authors: Yue Liu, Xihong Yang, Sihang Zhou, Xinwang Liu
- Abstract summary: We propose a Simple Contrastive Graph Clustering (SCGC) algorithm to improve the existing methods.
Our algorithm outperforms the recent contrastive deep clustering competitors with at least seven times speedup on average.
- Score: 41.396185271303956
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Contrastive learning has recently attracted plenty of attention in deep graph
clustering for its promising performance. However, complicated data
augmentations and time-consuming graph convolutional operation undermine the
efficiency of these methods. To solve this problem, we propose a Simple
Contrastive Graph Clustering (SCGC) algorithm to improve the existing methods
from the perspectives of network architecture, data augmentation, and objective
function. As to the architecture, our network includes two main parts, i.e.,
pre-processing and network backbone. A simple low-pass denoising operation
conducts neighbor information aggregation as an independent pre-processing, and
only two multilayer perceptrons (MLPs) are included as the backbone. For data
augmentation, instead of introducing complex operations over graphs, we
construct two augmented views of the same vertex by designing parameter
un-shared siamese encoders and corrupting the node embeddings directly.
Finally, as to the objective function, to further improve the clustering
performance, a novel cross-view structural consistency objective function is
designed to enhance the discriminative capability of the learned network.
Extensive experimental results on seven benchmark datasets validate our
proposed algorithm's effectiveness and superiority. Significantly, our
algorithm outperforms the recent contrastive deep clustering competitors with
at least seven times speedup on average.
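As a concrete reading of the pipeline described in the abstract, here is a minimal PyTorch-style sketch: low-pass smoothing as an independent pre-processing step, two parameter-unshared MLP encoders whose second view is corrupted with Gaussian noise on the embeddings, and a cross-view structural consistency loss that pulls the cross-view similarity matrix toward the self-looped adjacency. The filter depth `t`, noise scale `sigma`, layer sizes, and the exact form of the loss are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def low_pass_smooth(x, adj, t=3):
    """Pre-processing: t rounds of neighbour aggregation with the
    symmetrically normalised, self-looped adjacency (a simple low-pass filter)."""
    n = adj.size(0)
    a_hat = adj + torch.eye(n)
    d_inv_sqrt = a_hat.sum(dim=1).clamp(min=1e-12).pow(-0.5)
    a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
    for _ in range(t):
        x = a_norm @ x
    return x

class SCGCSketch(nn.Module):
    """Two parameter-unshared MLP encoders; the second view is corrupted by
    Gaussian noise on the embeddings instead of a graph-level augmentation."""
    def __init__(self, in_dim, hid_dim, sigma=0.01):
        super().__init__()
        self.enc1 = nn.Linear(in_dim, hid_dim)
        self.enc2 = nn.Linear(in_dim, hid_dim)
        self.sigma = sigma

    def forward(self, x_smooth):
        z1 = F.normalize(self.enc1(x_smooth), dim=1)
        z2 = F.normalize(self.enc2(x_smooth), dim=1)
        z2 = z2 + self.sigma * torch.randn_like(z2)  # embedding-level corruption
        return z1, z2

def cross_view_structural_loss(z1, z2, adj):
    """Assumed form of the objective: push the cross-view sample similarity
    matrix toward the self-looped adjacency matrix."""
    target = adj + torch.eye(adj.size(0))
    return F.mse_loss(z1 @ z2.t(), target)
```

Cluster assignments would then typically come from running K-means on the fused embeddings, e.g. `(z1 + z2) / 2`; that step and the training loop are omitted here.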
Related papers
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN)
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN)
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- ClusterGNN: Cluster-based Coarse-to-Fine Graph Neural Network for Efficient Feature Matching [15.620335576962475]
ClusterGNN is an attentional GNN architecture which operates on clusters for learning the feature matching task.
Our approach yields a 59.7% reduction in runtime and 58.4% reduction in memory consumption for dense detection.
arXiv Detail & Related papers (2022-04-25T14:43:15Z)
- Deep Graph Clustering via Dual Correlation Reduction [37.973072977988494]
We propose a novel self-supervised deep graph clustering method termed Dual Correlation Reduction Network (DCRN)
In our method, we first design a siamese network to encode samples. Then, by forcing the cross-view sample correlation matrix and the cross-view feature correlation matrix to approximate two identity matrices, respectively, we reduce the information correlation at both the sample and feature levels.
To alleviate the representation collapse caused by over-smoothing in GCNs, we introduce a propagation-regularization term that enables the network to capture long-distance information.
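For comparison with the cross-view objective above, a minimal sketch of the dual correlation reduction idea described in this entry, assuming dense PyTorch tensors; the normalisation choices and the equal weighting of the two terms are my assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def dual_correlation_reduction_loss(z1, z2):
    """Force the cross-view sample (N x N) and feature (D x D) correlation
    matrices toward identity, decorrelating redundant information."""
    n, d = z1.shape
    s_sample = F.normalize(z1, dim=1) @ F.normalize(z2, dim=1).t()   # N x N
    s_feature = F.normalize(z1, dim=0).t() @ F.normalize(z2, dim=0)  # D x D
    return F.mse_loss(s_sample, torch.eye(n)) + F.mse_loss(s_feature, torch.eye(d))
```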
arXiv Detail & Related papers (2021-12-29T04:05:38Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the view-similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
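For reference, the matrix Schatten p-norm underlying that objective is defined from the singular values as below; the tensor version used in the paper applies a slice-wise generalisation via the t-SVD, which is not reproduced here.

```latex
% sigma_i(X): the i-th singular value of X. For p = 1 this reduces to the
% nuclear norm; for 0 < p < 1 it is a quasi-norm that approximates the rank
% more tightly than the nuclear norm.
\[
  \|X\|_{S_p} \;=\; \Big( \sum_{i=1}^{\min(m,n)} \sigma_i(X)^{p} \Big)^{1/p},
  \qquad 0 < p \le 1 .
\]
```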
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Graph Contrastive Learning with Adaptive Augmentation [23.37786673825192]
We propose a novel graph contrastive representation learning method with adaptive augmentation.
Specifically, we design augmentation schemes based on node centrality measures to highlight important connective structures.
Our proposed method consistently outperforms existing state-of-the-art baselines and even surpasses some supervised counterparts.
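A small NumPy sketch of the centrality-based adaptive augmentation described above: edges incident to low-centrality nodes are dropped with higher probability, so the important connective structure tends to survive. Degree is used as the centrality measure and `p_max` caps the drop probability; both are illustrative choices rather than the paper's exact scheme.

```python
import numpy as np

def adaptive_edge_drop_probs(edges, num_nodes, p_max=0.7):
    """Assign higher drop probability to edges attached to low-centrality nodes."""
    deg = np.zeros(num_nodes)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    # Edge centrality: mean log-degree of the two endpoints (illustrative choice).
    cent = np.array([(np.log1p(deg[u]) + np.log1p(deg[v])) / 2 for u, v in edges])
    cent = (cent - cent.min()) / (cent.max() - cent.min() + 1e-12)  # scale to [0, 1]
    return p_max * (1.0 - cent)  # low centrality -> high drop probability

def drop_edges(edges, probs, seed=0):
    """Sample one augmented edge set by dropping edges with the given probabilities."""
    rng = np.random.default_rng(seed)
    keep = rng.random(len(probs)) >= probs
    return [e for e, k in zip(edges, keep) if k]
```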
arXiv Detail & Related papers (2020-10-27T15:12:21Z)
- Learning to Cluster Faces via Confidence and Connectivity Estimation [136.5291151775236]
We propose a fully learnable clustering framework without requiring a large number of overlapped subgraphs.
Our method significantly improves clustering accuracy and thus performance of the recognition models trained on top, yet it is an order of magnitude more efficient than existing supervised methods.
arXiv Detail & Related papers (2020-04-01T13:39:37Z)
- Learning to Hash with Graph Neural Networks for Recommender Systems [103.82479899868191]
Graph representation learning has attracted much attention in supporting high quality candidate search at scale.
Despite its effectiveness in learning embedding vectors for objects in the user-item interaction network, the computational cost of inferring users' preferences in continuous embedding space is substantial.
We propose a simple yet effective discrete representation learning framework to jointly learn continuous and discrete codes.
arXiv Detail & Related papers (2020-03-04T06:59:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.