How does Contrastive Learning Organize Images?
- URL: http://arxiv.org/abs/2305.10229v2
- Date: Fri, 17 Nov 2023 19:34:39 GMT
- Title: How does Contrastive Learning Organize Images?
- Authors: Yunzhe Zhang, Yao Lu, Qi Xuan
- Abstract summary: Contrastive learning, a dominant self-supervised technique, emphasizes similarity in representations between augmentations of the same input and dissimilarity for different ones.
Although low contrastive loss often correlates with high classification accuracy, recent studies challenge this direct relationship, spotlighting the crucial role of inductive biases.
From a clustering viewpoint, contrastive learning forms locally dense clusters, unlike the globally dense clusters of supervised learning; we introduce the "RLD (Relative Local Density)" metric to capture this discrepancy.
- Score: 8.077578967149561
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Contrastive learning, a dominant self-supervised technique, emphasizes
similarity in representations between augmentations of the same input and
dissimilarity for different ones. Although low contrastive loss often
correlates with high classification accuracy, recent studies challenge this
direct relationship, spotlighting the crucial role of inductive biases. We
delve into these biases from a clustering viewpoint, noting that contrastive
learning creates locally dense clusters, in contrast to the globally dense
clusters produced by supervised learning. To capture this discrepancy, we introduce
the "RLD (Relative Local Density)" metric. While this cluster property can
hinder linear classification accuracy, leveraging a Graph Convolutional Network
(GCN) based classifier mitigates this, boosting accuracy and reducing parameter
requirements. The code is available at
https://github.com/xsgxlz/How-does-Contrastive-Learning-Organize-Images/tree/main
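To make the abstract concrete, the following is a minimal sketch, not the authors' implementation (which is in the repository above): it builds a kNN graph over frozen contrastive embeddings, computes a crude local-versus-global density diagnostic standing in for the paper's RLD metric (the exact RLD definition is not reproduced here), and trains a one-layer GCN head on the graph. The helper names, k = 10, and the single-layer architecture are illustrative assumptions.

```python
# Hedged sketch: kNN-graph GCN head over frozen contrastive embeddings, plus a
# crude local-vs-global density diagnostic. NOT the paper's exact RLD metric or
# official code; hyperparameters and function names are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


def knn_adjacency(z: np.ndarray, k: int = 10) -> np.ndarray:
    """Symmetric kNN adjacency with self-loops from L2-normalized embeddings."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T                                  # cosine similarities
    np.fill_diagonal(sim, -np.inf)                 # never pick yourself as a neighbour
    nbrs = np.argsort(-sim, axis=1)[:, :k]         # k nearest neighbours per sample
    n = z.shape[0]
    adj = np.zeros((n, n), dtype=np.float32)
    adj[np.repeat(np.arange(n), k), nbrs.ravel()] = 1.0
    adj = np.maximum(adj, adj.T)                   # symmetrize
    return adj + np.eye(n, dtype=np.float32)       # add self-loops


def local_vs_global_density(z: np.ndarray, labels: np.ndarray, k: int = 10):
    """Crude diagnostic (a stand-in, not RLD): same-class fraction among kNN
    neighbours (local purity) vs. mean same-class cosine distance overall."""
    zn = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = zn @ zn.T
    np.fill_diagonal(sim, -np.inf)
    nbrs = np.argsort(-sim, axis=1)[:, :k]
    local_purity = (labels[nbrs] == labels[:, None]).mean()
    dists = 1.0 - zn @ zn.T
    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)
    return float(local_purity), float(dists[same].mean())


class TinyGCNClassifier(nn.Module):
    """One-layer GCN head: logits = (D^-1/2 A D^-1/2) Z W, with embeddings Z frozen."""

    def __init__(self, dim: int, num_classes: int):
        super().__init__()
        self.lin = nn.Linear(dim, num_classes, bias=False)

    def forward(self, z: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        d_inv_sqrt = adj.sum(dim=1).pow(-0.5)
        a_hat = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
        return self.lin(a_hat @ z)


if __name__ == "__main__":
    # Toy run on random data, purely to show the plumbing.
    rng = np.random.default_rng(0)
    z_np = rng.normal(size=(200, 64)).astype(np.float32)
    y_np = rng.integers(0, 10, size=200)
    print(local_vs_global_density(z_np, y_np, k=10))

    adj = torch.from_numpy(knn_adjacency(z_np, k=10))
    z, y = torch.from_numpy(z_np), torch.from_numpy(y_np).long()
    model = TinyGCNClassifier(dim=64, num_classes=10)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(50):                            # transductive toy training loop
        opt.zero_grad()
        F.cross_entropy(model(z, adj), y).backward()
        opt.step()
```

The head here is just a single dim x num_classes weight matrix applied after one round of neighbourhood averaging; the hope, in the spirit of the abstract, is that message passing over the kNN graph can exploit the locally dense cluster structure that a plain linear probe cannot.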
Related papers
- Towards Multi-view Graph Anomaly Detection with Similarity-Guided Contrastive Clustering [35.1801853090859]
Anomaly detection on graphs plays an important role in many real-world applications.
We propose an autoencoder-based clustering framework regularized by a similarity-guided contrastive loss to detect anomalous nodes.
arXiv Detail & Related papers (2024-09-15T15:41:59Z) - Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Networks (GCNs) have exhibited remarkable potential in improving graph-based clustering.
Existing models must estimate an initial graph beforehand in order to apply a GCN.
The Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z) - Improving Deep Representation Learning via Auxiliary Learnable Target Coding [69.79343510578877]
This paper introduces a novel learnable target coding scheme as an auxiliary regularization of deep representation learning.
Specifically, a margin-based triplet loss and a correlation consistency loss on the proposed target codes are designed to encourage more discriminative representations.
arXiv Detail & Related papers (2023-05-30T01:38:54Z) - Convolutional Fine-Grained Classification with Self-Supervised Target Relation Regularization [34.8793946023412]
This paper introduces a novel target coding scheme: dynamic target relation graphs (DTRG).
Online computation of class-level feature centers is designed to generate cross-category distances in the representation space.
The proposed target graphs can alleviate data sparsity and imbalance in representation learning.
arXiv Detail & Related papers (2022-08-03T11:51:53Z) - Chaos is a Ladder: A New Theoretical Understanding of Contrastive Learning via Augmentation Overlap [64.60460828425502]
We propose a new guarantee on the downstream performance of contrastive learning.
Our new theory hinges on the insight that the support of different intra-class samples will become more overlapped under aggressive data augmentations.
We propose an unsupervised model selection metric ARC that aligns well with downstream accuracy.
arXiv Detail & Related papers (2022-03-25T05:36:26Z) - Prototypical Classifier for Robust Class-Imbalanced Learning [64.96088324684683]
We propose Prototypical, which does not require fitting additional parameters given the embedding network.
Prototypical produces balanced and comparable predictions for all classes even though the training set is class-imbalanced.
We test our method on CIFAR-10LT, CIFAR-100LT and WebVision datasets, observing that Prototypical obtains substantial improvements compared with the state of the art.
arXiv Detail & Related papers (2021-10-22T01:55:01Z) - Contrastive Learning based Hybrid Networks for Long-Tailed Image Classification [31.647639786095993]
We propose a novel hybrid network structure that combines a supervised contrastive loss for learning image representations with a cross-entropy loss for learning the classifier (a minimal sketch of this combination appears after this list).
Experiments on three long-tailed classification datasets demonstrate the advantage of the proposed contrastive learning based hybrid networks in long-tailed classification.
arXiv Detail & Related papers (2021-03-26T05:22:36Z) - Class-incremental Learning with Rectified Feature-Graph Preservation [24.098892115785066]
A central theme of this paper is to learn new classes that arrive in sequential phases over time.
We propose a weighted-Euclidean regularization for old knowledge preservation.
We show how it can work with binary cross-entropy to increase class separation for effective learning of new classes.
arXiv Detail & Related papers (2020-12-15T07:26:04Z) - Unsupervised Feature Learning by Cross-Level Instance-Group Discrimination [68.83098015578874]
We integrate between-instance similarity into contrastive learning, not directly by instance grouping, but by cross-level discrimination.
Cross-Level Discrimination (CLD) effectively brings unsupervised learning closer to natural data and real-world applications.
CLD sets a new state of the art on self-supervised, semi-supervised, and transfer learning benchmarks, and outperforms MoCo v2 and SimCLR on every reported metric.
arXiv Detail & Related papers (2020-08-09T21:13:13Z) - Learning and Exploiting Interclass Visual Correlations for Medical Image Classification [30.88175218665726]
We present the Class-Correlation Learning Network (CCL-Net) to learn interclass visual correlations from given training data.
Instead of letting the network directly learn the desired correlations, we propose to learn them implicitly via distance metric learning of class-specific embeddings.
An intuitive loss based on a geometrical explanation of correlation is designed to bolster learning of the interclass correlations.
arXiv Detail & Related papers (2020-07-13T13:31:38Z) - Embedding Propagation: Smoother Manifold for Few-Shot Classification [131.81692677836202]
We propose to use embedding propagation as an unsupervised non-parametric regularizer for manifold smoothing in few-shot classification.
We empirically show that embedding propagation yields a smoother embedding manifold.
We show that embedding propagation consistently improves the accuracy of the models in multiple semi-supervised learning scenarios by up to 16 percentage points.
arXiv Detail & Related papers (2020-03-09T13:51:09Z)
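As a concrete companion to the long-tailed hybrid-network entry above, here is a minimal sketch of a supervised-contrastive-plus-cross-entropy hybrid loss. The single-view SupCon formulation, the temperature, and the weighting alpha are illustrative assumptions; the cited paper's two-branch architecture and training schedule are not reproduced.

```python
# Hedged sketch: supervised contrastive (SupCon-style) loss on normalized
# features combined with cross-entropy on classifier logits. Illustrative only;
# not the cited paper's exact formulation or hyperparameters.
import torch
import torch.nn.functional as F


def supcon_loss(feats: torch.Tensor, labels: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss with one view per sample.

    feats:  (N, D) L2-normalized features; labels: (N,) integer class labels.
    Anchors with no same-class positive in the batch are dropped.
    """
    sim = feats @ feats.T / temperature
    self_mask = torch.eye(feats.size(0), dtype=torch.bool, device=feats.device)
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask

    sim = sim.masked_fill(self_mask, float("-inf"))        # exclude self from the denominator
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    pos_counts = pos_mask.sum(dim=1)
    has_pos = pos_counts > 0
    sum_log_prob_pos = log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
    return -(sum_log_prob_pos[has_pos] / pos_counts[has_pos]).mean()


def hybrid_loss(feats, logits, labels, alpha: float = 0.5):
    """Weighted sum: SupCon on (normalized) features + cross-entropy on logits."""
    return alpha * supcon_loss(F.normalize(feats, dim=1), labels) \
        + (1.0 - alpha) * F.cross_entropy(logits, labels)


if __name__ == "__main__":
    torch.manual_seed(0)
    feats = torch.randn(32, 128)            # backbone features (pre-normalization)
    logits = torch.randn(32, 10)            # classifier-head logits
    labels = torch.randint(0, 10, (32,))
    print(hybrid_loss(feats, logits, labels).item())
```

Here the contrastive term pulls same-class features together on the unit sphere while the cross-entropy term trains the classifier logits; the weight alpha trades off the two objectives.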
This list is automatically generated from the titles and abstracts of the papers on this site.