Learning Uniform Clusters on Hypersphere for Deep Graph-level Clustering
- URL: http://arxiv.org/abs/2311.13953v1
- Date: Thu, 23 Nov 2023 12:08:20 GMT
- Title: Learning Uniform Clusters on Hypersphere for Deep Graph-level Clustering
- Authors: Mengling Hu, Chaochao Chen, Weiming Liu, Xinyi Zhang, Xinting Liao,
and Xiaolin Zheng
- Abstract summary: We propose a novel deep graph-level clustering method called Uniform Deep Graph Clustering (UDGC)
UDGC assigns instances evenly to different clusters and then scatters those clusters on the unit hypersphere, leading to a more uniform cluster-level distribution and mitigating cluster collapse.
Our empirical study on eight well-known datasets demonstrates that UDGC significantly outperforms the state-of-the-art models.
- Score: 25.350054742471816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph clustering has been popularly studied in recent years. However, most
existing graph clustering methods focus on node-level clustering, i.e.,
grouping nodes in a single graph into clusters. In contrast, graph-level
clustering, i.e., grouping multiple graphs into clusters, remains largely
unexplored. Graph-level clustering is critical in a variety of real-world
applications, such as molecular property prediction and community
analysis in social networks. However, graph-level clustering is challenging due
to the insufficient discriminability of graph-level representations, which
makes deep clustering more likely to obtain degenerate solutions (cluster
collapse). To address this issue, we propose a
novel deep graph-level clustering method called Uniform Deep Graph Clustering
(UDGC). UDGC assigns instances evenly to different clusters and then scatters
those clusters on the unit hypersphere, leading to a more uniform cluster-level
distribution and mitigating cluster collapse. Specifically, we first propose
Augmentation-Consensus Optimal Transport (ACOT) for generating uniformly
distributed and reliable pseudo labels for partitioning clusters. Then we adopt
contrastive learning to scatter those clusters. In addition, we propose Center
Alignment Optimal Transport (CAOT) to guide the model toward better
parameters, which further improves clustering performance. Our empirical study
on eight well-known datasets demonstrates that UDGC significantly outperforms
the state-of-the-art models.
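The two core ideas in the abstract, assigning instances evenly across clusters and scattering cluster centers on the unit hypersphere, can be illustrated with a short sketch. The snippet below is not the authors' ACOT/CAOT implementation; it is a minimal approximation assuming a Sinkhorn-style balanced assignment and a uniformity-style scattering loss, with hypothetical function names and hyperparameters.
```python
# Illustrative sketch only (PyTorch assumed): a balanced Sinkhorn-style
# pseudo-label assignment and a hypersphere-scattering loss, approximating the
# two mechanisms described in the abstract. Not the UDGC/ACOT/CAOT code.
import torch
import torch.nn.functional as F

def balanced_pseudo_labels(scores: torch.Tensor, n_iters: int = 3) -> torch.Tensor:
    """Sinkhorn-Knopp normalization of graph-to-cluster scores (B x K) that
    pushes the soft assignment toward equal cluster sizes."""
    q = torch.exp(scores / 0.05).t()              # (K, B)
    q /= q.sum()
    K, B = q.shape
    for _ in range(n_iters):
        q /= q.sum(dim=1, keepdim=True); q /= K   # equalize cluster marginals
        q /= q.sum(dim=0, keepdim=True); q /= B   # equalize sample marginals
    return (q * B).t()                            # (B, K), rows sum to 1

def scatter_clusters_on_sphere(centers: torch.Tensor, t: float = 2.0) -> torch.Tensor:
    """Uniformity-style loss that pushes L2-normalized cluster centers apart on
    the unit hypersphere; minimizing it mitigates cluster collapse."""
    c = F.normalize(centers, dim=1)
    d2 = torch.cdist(c, c).pow(2)
    off_diag = d2[~torch.eye(len(c), dtype=torch.bool)]
    return torch.exp(-t * off_diag).mean().log()

# Usage sketch (hypothetical tensors):
# scores = graph_embeddings @ cluster_centers.t()
# labels = balanced_pseudo_labels(scores)
# loss = scatter_clusters_on_sphere(cluster_centers)
```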
Related papers
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within the same cluster and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
- Dink-Net: Neural Clustering on Large Graphs [59.10189693120368]
A deep graph clustering method (Dink-Net) is proposed with the idea of dilation and shrink.
By discriminating whether nodes have been corrupted by augmentations, representations are learned in a self-supervised manner.
The clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss.
Compared to the runner-up, Dink-Net achieves a 9.62% NMI improvement on the ogbn-papers100M dataset with 111 million nodes and 1.6 billion edges.
arXiv Detail & Related papers (2023-05-28T15:33:24Z)
- Total Variation Graph Neural Networks [5.571369922847262]
Recently proposed Graph Neural Networks (GNNs) for vertex clustering are trained with an unsupervised minimum cut objective.
We propose a GNN model that computes cluster assignments by optimizing a tighter relaxation of the minimum cut (a generic sketch of this family of relaxed-cut objectives appears after this list).
arXiv Detail & Related papers (2022-11-11T14:13:14Z) - GLCC: A General Framework for Graph-level Clustering [5.069852282550117]
This paper studies the problem of graph-level clustering, which is a novel yet challenging task.
We propose a general graph-level clustering framework named Graph-Level Contrastive Clustering (GLCC)
Experiments on a range of well-known datasets demonstrate the superiority of our proposed GLCC over competitive baselines.
arXiv Detail & Related papers (2022-10-21T11:08:10Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep
Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model by contrasting cluster assignments, called GRCCA.
It aims to make good use of local and global information jointly by combining clustering algorithms and contrastive learning.
GRCCA is strongly competitive in most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z) - Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC)
In SCAGC, by leveraging inaccurate clustering labels, a self-supervised contrastive loss is designed for node representation learning.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z) - Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z) - Incorporating User's Preference into Attributed Graph Clustering [14.082520165369885]
We propose two quality measures for a local cluster: Graph Unimodality (GU) and Attribute Unimodality (AU)
The local cluster detected by the proposed LOCLU algorithm concentrates on the region of interest, provides efficient information flow in the graph, and exhibits a unimodal data distribution in the subspace of the designated attributes.
arXiv Detail & Related papers (2020-03-24T19:07:22Z)
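As referenced in the Total Variation Graph Neural Networks entry above, the following is a generic, hedged sketch of a relaxed minimum-cut clustering loss for GNN-produced soft assignments (in the spirit of MinCut-style relaxations). TVGNN's tighter total-variation relaxation has a different exact form, and the function name here is hypothetical.
```python
# Generic relaxed minimum-cut clustering loss for soft assignments produced by
# a GNN head. Illustrative of the relaxation family referenced above, not the
# exact total-variation relaxation proposed by TVGNN.
import torch

def relaxed_mincut_loss(S: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
    """S: (N, K) soft cluster assignments (rows sum to 1); A: (N, N) adjacency."""
    K = S.shape[1]
    D = torch.diag(A.sum(dim=1))                                     # degree matrix
    cut = torch.trace(S.t() @ A @ S) / torch.trace(S.t() @ D @ S)    # intra-cluster edge mass
    StS = S.t() @ S
    ortho = torch.norm(StS / StS.norm() - torch.eye(K) / K ** 0.5)   # balance/orthogonality term
    return -cut + ortho          # minimize: maximize the cut ratio while keeping clusters balanced
```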
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.