GLCC: A General Framework for Graph-level Clustering
- URL: http://arxiv.org/abs/2210.11879v1
- Date: Fri, 21 Oct 2022 11:08:10 GMT
- Title: GLCC: A General Framework for Graph-level Clustering
- Authors: Wei Ju, Yiyang Gu, Binqi Chen, Gongbo Sun, Yifang Qin, Xingyuming Liu,
Xiao Luo, Ming Zhang
- Abstract summary: This paper studies the problem of graph-level clustering, which is a novel yet challenging task.
We propose a general graph-level clustering framework named Graph-Level Contrastive Clustering (GLCC)
Experiments on a range of well-known datasets demonstrate the superiority of our proposed GLCC over competitive baselines.
- Score: 5.069852282550117
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies the problem of graph-level clustering, which is a novel
yet challenging task. This problem is critical in a variety of real-world
applications such as protein clustering and genome analysis in bioinformatics.
Recent years have witnessed the success of deep clustering coupled with graph
neural networks (GNNs). However, existing methods focus on clustering nodes
within a single graph, while clustering across multiple graphs remains
under-explored. In this paper, we propose a general graph-level
clustering framework named Graph-Level Contrastive Clustering (GLCC) given
multiple graphs. Specifically, GLCC first constructs an adaptive affinity graph
to explore instance- and cluster-level contrastive learning (CL).
Instance-level CL leverages graph Laplacian based contrastive loss to learn
clustering-friendly representations while cluster-level CL captures
discriminative cluster representations incorporating neighbor information of
each sample. Moreover, we utilize neighbor-aware pseudo-labels to reward the
optimization of representation learning. The two steps are trained
alternately so that they collaborate and benefit each other. Experiments on a range of
well-known datasets demonstrate the superiority of our proposed GLCC over
competitive baselines.
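The instance-level objective described above can be illustrated with a minimal sketch: an InfoNCE-style contrastive loss whose positives are weighted by the affinity graph, so neighbors on the graph are pulled together and all other pairs pushed apart. This is not the paper's exact formulation; the function name, the temperature value, and the simple row-normalized weighting are assumptions for illustration.

```python
import numpy as np

def laplacian_contrastive_loss(Z, W, tau=0.5):
    """Hypothetical instance-level contrastive loss on an affinity graph.

    Z: (n, d) graph-level embeddings; W: (n, n) nonnegative affinities.
    Pairs with W[i, j] > 0 act as positives, all other pairs as negatives.
    """
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)   # cosine geometry
    sim = np.exp(Z @ Z.T / tau)                        # tempered similarities
    np.fill_diagonal(sim, 0.0)                         # drop self-pairs
    W = W.copy().astype(float)
    np.fill_diagonal(W, 0.0)
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)  # row-normalize
    log_prob = np.log(sim / sim.sum(axis=1, keepdims=True) + 1e-12)
    return float(-(P * log_prob).sum() / len(Z))
```

Embeddings whose geometry matches the affinity graph yield a lower loss than embeddings that ignore it, which is the sense in which the loss encourages clustering-friendly representations.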
Related papers
- Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
Existing models typically estimate an initial graph beforehand in order to apply GCN.
Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z) - Learning Uniform Clusters on Hypersphere for Deep Graph-level Clustering [25.350054742471816]
We propose a novel deep graph-level clustering method called Uniform Deep Graph Clustering (UDGC)
UDGC assigns instances evenly to different clusters and then scatters those clusters on the unit hypersphere, leading to a more uniform cluster-level distribution and mitigating cluster collapse.
Our empirical study on eight well-known datasets demonstrates that UDGC significantly outperforms the state-of-the-art models.
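The idea of scattering clusters on the unit hypersphere can be sketched with a uniformity score in the spirit of the alignment/uniformity literature; the function name and the temperature `t` are assumptions for illustration, not UDGC's exact objective.

```python
import numpy as np

def hypersphere_uniformity(centers, t=2.0):
    """Hypothetical uniformity score for L2-normalized cluster centers.
    Lower is better: evenly spread centers minimize it, while collapsed
    centers (all nearly identical) drive it toward log(1) = 0."""
    C = centers / np.linalg.norm(centers, axis=1, keepdims=True)
    d2 = ((C[:, None, :] - C[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    iu = np.triu_indices(len(C), k=1)                    # distinct pairs only
    return float(np.log(np.exp(-t * d2[iu]).mean()))
```

Minimizing such a score pushes cluster centers apart on the sphere, which is one way to obtain the "more uniform cluster-level distribution" the summary describes.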
arXiv Detail & Related papers (2023-11-23T12:08:20Z) - Kernel-based Joint Multiple Graph Learning and Clustering of Graph
Signals [2.4305626489408465]
We introduce a Kernel-based framework for joint Multiple Graph Learning (KMGL) and clustering of graph signals.
Experiments demonstrate that KMGL significantly enhances the robustness of GL clustering, particularly in scenarios with high noise levels.
These findings underscore the potential of KMGL for improving the performance of Graph Signal Processing methods in diverse real-world applications.
arXiv Detail & Related papers (2023-10-29T13:41:12Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified in a single framework.
To provide feedback for these actions, a clustering-oriented reward function is proposed that enhances cohesion within clusters and separation between clusters.
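A cohesion/separation reward of the kind described can be sketched as the mean within-cluster cosine similarity minus the mean between-cluster similarity; the function name and this particular formula are assumptions for illustration, not the paper's exact reward.

```python
import numpy as np

def clustering_reward(Z, labels):
    """Hypothetical clustering-oriented reward: cohesion minus separation.
    Z: (n, d) node embeddings; labels: (n,) integer cluster assignments."""
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    S = Z @ Z.T                                   # cosine similarities
    same = labels[:, None] == labels[None, :]
    diff = ~same                                  # diagonal is False here
    np.fill_diagonal(same, False)                 # drop self-pairs
    return float(S[same].mean() - S[diff].mean())
```

Higher rewards indicate tighter clusters that are further apart, giving an agent a usable feedback signal for choosing the cluster number.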
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model by contrasting cluster assignments, called GRCCA.
It aims to exploit local and global information jointly by combining clustering algorithms with contrastive learning.
GRCCA is strongly competitive in most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z) - Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC)
In SCAGC, a self-supervised contrastive loss that leverages inaccurate clustering labels is designed for node representation learning.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-efficient, obtains stable results, and scales well with the data size.
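For intuition, the (matrix) Schatten p-norm underlying the tensor version is simply the l_p norm of the singular values; this sketch shows the matrix case only, and the function name is an assumption.

```python
import numpy as np

def schatten_p_norm(X, p=0.5):
    """Schatten p-norm: the l_p norm of the singular values of X.
    For p = 1 this is the nuclear norm; for p < 1 it is a nonconvex
    surrogate that approximates the matrix rank more tightly."""
    s = np.linalg.svd(X, compute_uv=False)
    return float((s ** p).sum() ** (1.0 / p))
```

Minimizing such a norm over a tensor stacking the view-specific graphs encourages the views to share low-rank structure, which is how the method couples graphs of different views.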
arXiv Detail & Related papers (2021-08-15T13:14:28Z) - Structured Graph Learning for Clustering and Semi-supervised
Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.