Probing clustering in neural network representations
- URL: http://arxiv.org/abs/2311.07864v1
- Date: Tue, 14 Nov 2023 02:33:54 GMT
- Title: Probing clustering in neural network representations
- Authors: Thao Nguyen, Simon Kornblith
- Abstract summary: We study how the many design choices involved in neural network training affect the clusters formed in the hidden representations.
We isolate the training dataset and architecture as important factors affecting clusterability.
We find that normalization strategies affect which layers yield the best clustering performance.
- Score: 30.640266399583613
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural network representations contain structure beyond what was present in
the training labels. For instance, representations of images that are visually
or semantically similar tend to lie closer to each other than to dissimilar
images, regardless of their labels. Clustering these representations can thus
provide insights into dataset properties as well as the network internals. In
this work, we study how the many design choices involved in neural network
training affect the clusters formed in the hidden representations. To do so, we
establish an evaluation setup based on the BREEDS hierarchy, for the task of
subclass clustering after training models with only superclass information. We
isolate the training dataset and architecture as important factors affecting
clusterability. Datasets with labeled classes consisting of unrelated
subclasses yield much better clusterability than those following a natural
hierarchy. When using pretrained models to cluster representations on
downstream datasets, models pretrained on subclass labels provide better
clusterability than models pretrained on superclass labels, but only when there
is a high degree of domain overlap between the pretraining and downstream data.
Architecturally, we find that normalization strategies affect which layers
yield the best clustering performance, and, surprisingly, Vision Transformers
attain lower subclass clusterability than ResNets.
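As a concrete illustration of the evaluation setup described in the abstract, the sketch below clusters penultimate-layer features of a model trained with superclass labels only, then scores the clusters against held-out subclass labels. The abstract does not specify the clustering algorithm or metric, so k-means and adjusted mutual information (AMI) are stand-in assumptions here, as are the function name `subclass_clusterability` and a loader that yields subclass labels.

```python
# Minimal sketch of a subclass-clusterability evaluation, assuming k-means
# and adjusted mutual information (AMI); the paper's exact protocol may differ.
import torch
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_mutual_info_score

def subclass_clusterability(model, loader, n_subclasses, device="cpu"):
    """Cluster penultimate-layer features of a superclass-trained model and
    score them against subclass labels the model never saw during training."""
    model.eval().to(device)
    # Drop the final classifier head; this works for torchvision-style
    # backbones (e.g., ResNet) whose last child is the fc layer.
    feature_extractor = torch.nn.Sequential(*list(model.children())[:-1])
    feats, labels = [], []
    with torch.no_grad():
        for images, subclass in loader:  # loader must yield subclass labels
            f = feature_extractor(images.to(device))
            feats.append(torch.flatten(f, 1).cpu())
            labels.append(subclass)
    feats = torch.cat(feats).numpy()
    labels = torch.cat(labels).numpy()
    clusters = KMeans(n_clusters=n_subclasses, n_init=10).fit_predict(feats)
    return adjusted_mutual_info_score(labels, clusters)
```

Under this protocol, a higher score means the hidden representations separate subclasses the network was never explicitly trained to distinguish; probing different layers is how effects like the normalization-dependent layer choice mentioned above would surface.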
Related papers
- An Empirical Study into Clustering of Unseen Datasets with Self-Supervised Encoders [34.000135361782206]
We deploy pretrained image models on datasets they were not trained for, and investigate whether their embeddings form meaningful clusters.
This evaluation provides new insights into the embeddings of self-supervised models, which prioritize different features than supervised models do.
arXiv Detail & Related papers (2024-06-04T16:34:17Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within the same cluster and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - XAI for Self-supervised Clustering of Wireless Spectrum Activity [0.5809784853115825]
We propose a methodology for explaining self-supervised deep clustering architectures.
For the representation learning part, our methodology employs Guided Backpropagation to interpret the regions of interest of the input data.
For the clustering part, the methodology relies on Shallow Trees to explain the clustering result.
Finally, a data-specific visualization component connects each cluster to the input data through the relevant features.
arXiv Detail & Related papers (2023-05-17T08:56:43Z) - Contrastive Hierarchical Clustering [8.068701201341065]
CoHiClust is a Contrastive Hierarchical Clustering model based on deep neural networks.
By employing a self-supervised learning approach, CoHiClust distills the base network into a binary tree without access to any labeled data.
arXiv Detail & Related papers (2023-03-03T07:54:19Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep
Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC).
In SCAGC, a self-supervised contrastive loss is designed for node representation learning by leveraging inaccurate clustering labels.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z) - Learning Statistical Representation with Joint Deep Embedded Clustering [2.1267423178232407]
StatDEC is an unsupervised framework for joint statistical representation learning and clustering.
Our experiments show that using these representations, one can considerably improve results on imbalanced image clustering across a variety of image datasets.
arXiv Detail & Related papers (2021-09-11T09:26:52Z) - Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z) - Graph Contrastive Clustering [131.67881457114316]
We propose a novel graph contrastive learning framework, which we apply to the clustering task to obtain the Graph Contrastive Clustering (GCC) method; a generic sketch of such a contrastive loss appears after this list.
Specifically, on the one hand, a graph Laplacian-based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
arXiv Detail & Related papers (2021-04-03T15:32:49Z) - Structured Graph Learning for Clustering and Semi-supervised
Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and an adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.