Deep Clustering With Consensus Representations
- URL: http://arxiv.org/abs/2210.07063v1
- Date: Thu, 13 Oct 2022 14:40:48 GMT
- Title: Deep Clustering With Consensus Representations
- Authors: Lukas Miklautz, Martin Teuffenbach, Pascal Weber, Rona Perjuci, Walid
Durani, Christian Böhm, Claudia Plant
- Abstract summary: We introduce the idea of learning consensus representations for heterogeneous clusterings, a novel notion to approach consensus clustering.
We propose DECCS, the first deep clustering method that jointly improves the representation and clustering results of multiple heterogeneous clustering algorithms.
- Score: 10.058084837348366
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The field of deep clustering combines deep learning and clustering to learn
representations that improve both the embedding and the
performance of the considered clustering method. Most existing deep clustering
methods are designed for a single clustering method, e.g., k-means, spectral
clustering, or Gaussian mixture models, but it is well known that no clustering
algorithm works best in all circumstances. Consensus clustering tries to
alleviate the individual weaknesses of clustering algorithms by building a
consensus between members of a clustering ensemble. Currently, there is no deep
clustering method that can include multiple heterogeneous clustering algorithms
in an ensemble to update representations and clusterings together. To close
this gap, we introduce the idea of a consensus representation that maximizes
the agreement between ensemble members. Further, we propose DECCS (Deep
Embedded Clustering with Consensus representationS), a deep consensus
clustering method that learns a consensus representation by enhancing the
embedded space to such a degree that all ensemble members agree on a common
clustering result. Our contributions are the following: (1) We introduce the
idea of learning consensus representations for heterogeneous clusterings, a
novel notion to approach consensus clustering. (2) We propose DECCS, the first
deep clustering method that jointly improves the representation and clustering
results of multiple heterogeneous clustering algorithms. (3) We show in
experiments that learning a consensus representation with DECCS
outperforms several relevant baselines from deep clustering and consensus
clustering. Our code can be found at https://gitlab.cs.univie.ac.at/lukas/deccs
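The setting the abstract describes can be illustrated with a minimal sketch (this is not the authors' DECCS implementation): run several heterogeneous clustering algorithms on the same data and measure how much their labelings agree; DECCS trains the embedding so that this agreement is maximized. The algorithm choices and toy data below are illustrative assumptions.

```python
# Minimal sketch of the heterogeneous-ensemble setting (not DECCS itself):
# cluster the same data with three different algorithm families and measure
# pairwise agreement of the resulting labelings.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

labelings = {
    "kmeans": KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X),
    "gmm": GaussianMixture(n_components=4, random_state=0).fit_predict(X),
    "agglo": AgglomerativeClustering(n_clusters=4).fit_predict(X),
}

# Pairwise agreement between ensemble members; a consensus representation
# would be trained to push these scores toward 1.
names = list(labelings)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        ari = adjusted_rand_score(labelings[names[i]], labelings[names[j]])
        print(f"{names[i]} vs {names[j]}: ARI = {ari:.2f}")
```

On an embedding where all members agree on a common clustering, every pairwise adjusted Rand index would reach 1; DECCS enhances the embedded space toward that point.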
Related papers
- Similarity and Dissimilarity Guided Co-association Matrix Construction for Ensemble Clustering [22.280221709474105]
We propose the Similarity and Dissimilarity Guided Co-association matrix (SDGCA) to achieve ensemble clustering.
First, we introduce normalized ensemble entropy to estimate the quality of each cluster, and construct a similarity matrix based on this estimation.
Second, we employ random walks to explore the high-order proximity of base clusterings and construct a dissimilarity matrix.
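The similarity and dissimilarity matrices above refine the classical co-association matrix of ensemble clustering. A minimal sketch of that classical ingredient follows (this is not the SDGCA construction itself):

```python
# Plain co-association matrix: the fraction of base clusterings that place
# each pair of points in the same cluster.
import numpy as np

def co_association(base_labelings):
    """Return the (n, n) co-association matrix of a list of labelings."""
    base_labelings = np.asarray(base_labelings)  # shape (m, n)
    m, _ = base_labelings.shape
    C = np.zeros((base_labelings.shape[1], base_labelings.shape[1]))
    for labels in base_labelings:
        # 1 where a pair shares a cluster in this base clustering
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / m

toy_labelings = [
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
print(co_association(toy_labelings))
```

Points 0 and 1 are grouped together in all three base clusterings, so their entry is 1.0; points 2 and 3 share a cluster in two of three, giving 2/3.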
arXiv Detail & Related papers (2024-11-01T08:10:28Z) - Towards Explainable Clustering: A Constrained Declarative based Approach [0.294944680995069]
We aim at finding a clustering that has high quality in terms of classic clustering criteria and that is explainable.
A good global explanation of a clustering should give the characteristics of each cluster taking into account their abilities to describe its objects.
We propose a novel interpretable constrained method called ECS for declarative computation with Explainability-driven Selection.
arXiv Detail & Related papers (2024-03-26T21:00:06Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified in a single framework.
In order to conduct feedback actions, the clustering-oriented reward function is proposed to enhance the cohesion of the same clusters and separate the different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - DivClust: Controlling Diversity in Deep Clustering [47.85350249697335]
DivClust produces consensus clustering solutions that consistently outperform single-clustering baselines.
Our method effectively controls diversity across frameworks and datasets with very small additional computational cost.
arXiv Detail & Related papers (2023-04-03T14:45:43Z) - Cluster-level Group Representativity Fairness in $k$-means Clustering [3.420467786581458]
Clustering algorithms could generate clusters such that different groups are disadvantaged within different clusters.
We develop a clustering algorithm, building upon the centroid clustering paradigm pioneered by classical algorithms.
We show that our method is effective in enhancing cluster-level group representativity fairness significantly at low impact on cluster coherence.
arXiv Detail & Related papers (2022-12-29T22:02:28Z) - Deep Clustering: A Comprehensive Survey [53.387957674512585]
Clustering analysis plays an indispensable role in machine learning and data mining.
Deep clustering, which can learn clustering-friendly representations using deep neural networks, has been broadly applied in a wide range of clustering tasks.
Existing surveys for deep clustering mainly focus on the single-view fields and the network architectures, ignoring the complex application scenarios of clustering.
arXiv Detail & Related papers (2022-10-09T02:31:32Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Graph Contrastive Clustering [131.67881457114316]
We propose a novel graph contrastive learning framework, which is then applied to the clustering task, yielding the Graph Contrastive Clustering (GCC) method.
Specifically, on the one hand, the graph Laplacian based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
arXiv Detail & Related papers (2021-04-03T15:32:49Z) - Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
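For contrast with the scalable method above, standard quadratic-cost agglomerative clustering is available off the shelf in SciPy; the data and parameters below are illustrative and unrelated to the paper.

```python
# Classical bottom-up agglomerative clustering with SciPy (quadratic cost,
# i.e. the baseline that scalable methods aim to improve on).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two well-separated toy blobs of 20 points each
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])

Z = linkage(X, method="average")                  # bottom-up merge tree
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into 2 clusters
print(np.unique(labels))
```

The merge tree `Z` encodes the full hierarchy, so the same linkage can be cut at any level without re-clustering.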
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.