Dual-disentangled Deep Multiple Clustering
- URL: http://arxiv.org/abs/2402.05310v1
- Date: Wed, 7 Feb 2024 23:05:30 GMT
- Title: Dual-disentangled Deep Multiple Clustering
- Authors: Jiawei Yao and Juhua Hu
- Abstract summary: We propose a novel Dual-Disentangled deep Multiple Clustering method named DDMC by learning disentangled representations.
Our experiments demonstrate that DDMC consistently outperforms state-of-the-art methods across seven commonly used tasks.
- Score: 4.040415173465543
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multiple clustering has gathered significant attention in recent years due to
its potential to reveal multiple hidden structures of the data from different
perspectives. Most multiple clustering methods first derive feature
representations by controlling the dissimilarity among them, and then apply
traditional clustering methods (e.g., k-means) to obtain the final multiple
clustering outcomes. However, the learned feature representations can be only
weakly relevant to the ultimate goal of producing distinct clusterings.
Moreover, these features are often not explicitly learned for the purpose of
clustering.
Therefore, in this paper, we propose a novel Dual-Disentangled deep Multiple
Clustering method named DDMC by learning disentangled representations.
Specifically, DDMC is achieved by a variational Expectation-Maximization (EM)
framework. In the E-step, the disentanglement learning module employs
coarse-grained and fine-grained disentangled representations to obtain a more
diverse set of latent factors from the data. In the M-step, the cluster
assignment module utilizes a cluster objective function to augment the
effectiveness of the cluster output. Our extensive experiments demonstrate that
DDMC consistently outperforms state-of-the-art methods across seven commonly
used tasks. Our code is available at https://github.com/Alexander-Yao/DDMC.
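The E-step/M-step alternation described above can be pictured with a deliberately simplified stand-in; the sketch below is not the paper's model. PCA on disjoint feature subsets stands in for the learned coarse- and fine-grained disentangled representations, and k-means stands in for the cluster-assignment module.
```python
# Structural sketch of the alternating E/M loop described above; NOT DDMC
# itself. Feature-subset PCA stands in for the learned disentangled
# representations, k-means for the cluster-assignment module.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))        # toy data
n_clusterings, n_clusters = 2, 3      # seek two alternative clusterings

assignments = [None] * n_clusterings
for em_iter in range(5):
    for view in range(n_clusterings):
        # "E-step" stand-in: derive a latent representation for this view.
        # DDMC instead learns coarse- and fine-grained disentangled factors
        # that depend on the current assignments; this stand-in does not,
        # so the outer loop here is purely structural.
        Z = PCA(n_components=4).fit_transform(X[:, view * 16:(view + 1) * 16])
        # "M-step" stand-in: update this view's cluster assignments with a
        # clustering objective (here, plain k-means).
        assignments[view] = KMeans(n_clusters=n_clusters, n_init=10,
                                   random_state=0).fit_predict(Z)
```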
Related papers
- Dying Clusters Is All You Need -- Deep Clustering With an Unknown Number of Clusters [5.507296054825372]
Finding meaningful groups in high-dimensional data is an important challenge in data mining.
Deep clustering methods have achieved remarkable results in these tasks.
Most of these methods require the user to specify the number of clusters in advance.
This is a major limitation since the number of clusters is typically unknown if labeled data is unavailable.
Most of these approaches also estimate the number of clusters separately from the clustering process.
arXiv Detail & Related papers (2024-10-12T11:04:10Z)
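For context on the limitation this paper targets, the usual practice is to choose the number of clusters outside the clustering process, e.g. by sweeping k and scoring each run. A minimal example of that separate-estimation baseline (not the paper's method):
```python
# Baseline illustrating the limitation above: k is chosen by an external
# sweep (silhouette score), separate from the clustering process itself.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X = np.random.default_rng(0).normal(size=(300, 8))   # toy data
scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10,
                                        random_state=0).fit_predict(X))
          for k in range(2, 10)}
best_k = max(scores, key=scores.get)
print(best_k, scores[best_k])
```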
- Self Supervised Correlation-based Permutations for Multi-View Clustering [7.972599673048582]
We propose an end-to-end deep learning-based MVC framework for general data.
Our approach involves learning meaningful fused data representations with a novel permutation-based canonical correlation objective.
We demonstrate the effectiveness of our model using ten MVC benchmark datasets.
arXiv Detail & Related papers (2024-02-26T08:08:30Z)
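The summary above mentions a permutation-based canonical correlation objective; the hypothetical helper below shows only the correlation core of such an objective on two already-encoded views, omitting the permutation component entirely.
```python
# Correlation core of a multi-view objective (hypothetical helper): reward
# dimension-wise Pearson correlation between two views' representations.
# The paper's actual objective is permutation-based; that part is omitted.
import numpy as np

def correlation_objective(z1, z2, eps=1e-8):
    z1 = (z1 - z1.mean(axis=0)) / (z1.std(axis=0) + eps)
    z2 = (z2 - z2.mean(axis=0)) / (z2.std(axis=0) + eps)
    # mean per-dimension correlation in [-1, 1]; training would maximize it
    return float((z1 * z2).sum(axis=0).mean() / len(z1))
```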
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In the proposed method, cluster number determination and unsupervised representation learning are unified within a single framework.
To provide feedback for the agent's actions, a clustering-oriented reward function is proposed that increases cohesion within clusters and separation between clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
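A toy version of a clustering-oriented reward in the spirit of the description above: higher when embeddings are close to their own cluster center and centers are far apart. The exact formulation in the paper may differ.
```python
# Toy clustering-oriented reward: within-cluster cohesion plus
# between-cluster separation (illustrative; not the paper's exact form).
# Assumes at least two clusters are present in `labels`.
import numpy as np

def clustering_reward(Z, labels):
    ids = np.unique(labels)
    centers = np.stack([Z[labels == c].mean(axis=0) for c in ids])
    to_center = {c: centers[i] for i, c in enumerate(ids)}
    # cohesion: negative mean distance of points to their own center
    cohesion = -np.mean([np.linalg.norm(z - to_center[l])
                         for z, l in zip(Z, labels)])
    # separation: mean pairwise distance between cluster centers
    separation = np.mean([np.linalg.norm(centers[i] - centers[j])
                          for i in range(len(ids))
                          for j in range(i + 1, len(ids))])
    return cohesion + separation
```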
- AugDMC: Data Augmentation Guided Deep Multiple Clustering [2.479720095773358]
AugDMC is a novel data Augmentation guided Deep Multiple Clustering method.
It exploits data augmentations to automatically extract features related to a certain aspect of the data.
A stable optimization strategy is proposed to alleviate the instability arising from different augmentations.
arXiv Detail & Related papers (2023-06-22T16:31:46Z)
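A rough illustration of the augmentation-guided idea above, with placeholder vector "augmentations"; the paper works on images and trains an encoder per augmentation family.
```python
# Rough illustration only: two stand-in "augmentation families" each induce
# a different clustering of the same data. AugDMC instead trains encoders on
# image augmentations to isolate aspect-specific features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
augmentations = {
    "aspect_a": lambda x: x + rng.normal(scale=0.1, size=x.shape),     # jitter
    "aspect_b": lambda x: x * rng.uniform(0.5, 1.5, size=x.shape[1]),  # rescale dims
}
clusterings = {name: KMeans(n_clusters=3, n_init=10,
                            random_state=0).fit_predict(aug(X))
               for name, aug in augmentations.items()}
```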
- One-step Multi-view Clustering with Diverse Representation [47.41455937479201]
We propose a one-step multi-view clustering with diverse representation method, which incorporates multi-view learning and $k$-means into a unified framework.
We develop an efficient optimization algorithm with proven convergence to solve the resultant problem.
arXiv Detail & Related papers (2023-06-08T02:52:24Z)
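To picture the unified multi-view + k-means objective above, here is a two-stage stand-in; the paper optimizes both parts jointly in one step.
```python
# Two-stage stand-in for the unified objective above: fuse normalized view
# representations, then run k-means. The paper optimizes both jointly.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
views = [rng.normal(size=(300, d)) for d in (10, 20)]   # two toy views
fused = np.concatenate(
    [v / (np.linalg.norm(v, axis=1, keepdims=True) + 1e-12) for v in views],
    axis=1)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(fused)
```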
- MHCCL: Masked Hierarchical Cluster-Wise Contrastive Learning for Multivariate Time Series [20.008535430484475]
A Masked Hierarchical Cluster-wise Contrastive Learning (MHCCL) model is presented.
It exploits semantic information obtained from the hierarchical structure consisting of multiple latent partitions for time series.
It is shown to be superior to state-of-the-art approaches for unsupervised time series representation learning.
arXiv Detail & Related papers (2022-12-02T12:42:53Z)
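One level of a cluster-wise contrastive loss might look as follows (hypothetical helper; MHCCL additionally masks outliers when building prototypes and repeats this across the hierarchy of partitions).
```python
# One level of a cluster-wise contrastive loss (hypothetical helper):
# pull each sample toward its partition's prototype, away from the others.
# Assumes `labels` is an integer array with contiguous values 0..K-1.
import numpy as np

def cluster_contrastive_loss(Z, labels, temperature=0.5):
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    protos = np.stack([Z[labels == c].mean(axis=0)
                       for c in range(labels.max() + 1)])
    protos = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    logits = Z @ protos.T / temperature
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-log_prob[np.arange(len(Z)), labels].mean())
```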
- Unified Multi-View Orthonormal Non-Negative Graph Based Clustering Framework [74.25493157757943]
We formulate a novel clustering model, which exploits the non-negative feature property and incorporates the multi-view information into a unified joint learning framework.
We also explore, for the first time, the multi-model non-negative graph-based approach to clustering data based on deep features.
arXiv Detail & Related papers (2022-11-03T08:18:27Z)
- DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z)
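The layer-ensemble idea above can be approximated with a co-association matrix over per-layer clusterings; this is a sketch with random stand-in features, and DeepCluE's actual pipeline is more elaborate.
```python
# Sketch of multi-layer ensembling: cluster each layer's features, build a
# co-association matrix, then extract a consensus clustering from it.
# Random arrays stand in for real layer activations.
import numpy as np
from sklearn.cluster import KMeans, SpectralClustering

rng = np.random.default_rng(0)
n, k = 200, 4
layer_feats = [rng.normal(size=(n, d)) for d in (64, 32, 16)]

co = np.zeros((n, n))
for F in layer_feats:
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(F)
    co += (labels[:, None] == labels[None, :]).astype(float)
co /= len(layer_feats)                       # pairwise agreement in [0, 1]
consensus = SpectralClustering(n_clusters=k, affinity="precomputed",
                               random_state=0).fit_predict(co)
```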
- Deep Attention-guided Graph Clustering with Dual Self-supervision [49.040136530379094]
We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC).
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
arXiv Detail & Related papers (2021-11-10T06:53:03Z)
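The two losses named above can be pictured with DEC-style stand-ins: a soft KL term between a sharpened target and the soft assignments, plus a hard cross-entropy on confident pseudo-labels. The paper's soft term is a triplet KL loss; this sketch shows only the plain KL version.
```python
# Stand-ins for the dual self-supervision above. Q holds soft cluster
# assignments (rows sum to 1). The soft loss is plain DEC-style KL, not the
# paper's triplet variant; the pseudo-label threshold is illustrative.
import numpy as np

def soft_kl_loss(Q, eps=1e-8):
    P = (Q ** 2) / Q.sum(axis=0)          # sharpen, reweight by cluster size
    P = P / P.sum(axis=1, keepdims=True)  # renormalize rows
    return float((P * np.log((P + eps) / (Q + eps))).sum() / len(Q))

def hard_pseudo_loss(Q, threshold=0.9, eps=1e-8):
    confident = Q.max(axis=1) >= threshold
    if not confident.any():
        return 0.0
    pseudo = Q[confident].argmax(axis=1)  # hard pseudo-labels
    return float(-np.log(Q[confident, pseudo] + eps).mean())
```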
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
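For reference, the standard (quadratic-cost) agglomerative procedure that this paper aims to scale looks like the following with SciPy.
```python
# Standard hierarchical agglomerative clustering via SciPy, shown only as
# the quality baseline; its O(n^2) cost is what the paper's method avoids.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.random.default_rng(0).normal(size=(100, 5))
Z = linkage(X, method="average")                  # full dendrogram
labels = fcluster(Z, t=4, criterion="maxclust")   # cut into 4 flat clusters
```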
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.