Unsupervised Feature Learning by Cross-Level Instance-Group
Discrimination
- URL: http://arxiv.org/abs/2008.03813v5
- Date: Sun, 16 May 2021 03:11:23 GMT
- Title: Unsupervised Feature Learning by Cross-Level Instance-Group
Discrimination
- Authors: Xudong Wang, Ziwei Liu, Stella X. Yu
- Abstract summary: We integrate between-instance similarity into contrastive learning, not directly by instance grouping, but by cross-level discrimination.
CLD effectively brings unsupervised learning closer to natural data and real-world applications.
New state of the art on self-supervision, semi-supervision, and transfer learning benchmarks; beats MoCo v2 and SimCLR on every reported result.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised feature learning has made great strides with contrastive
learning based on instance discrimination and invariant mapping, as benchmarked
on curated class-balanced datasets. However, natural data can be highly
correlated and long-tail distributed. Natural between-instance similarity
conflicts with the presumed instance distinction, causing unstable training and
poor performance.
Our idea is to discover and integrate between-instance similarity into
contrastive learning, not directly by instance grouping, but by cross-level
discrimination (CLD) between instances and local instance groups. While
invariant mapping of each instance is imposed by attraction within its
augmented views, between-instance similarity could emerge from common repulsion
against instance groups.
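To make the cross-level mechanism concrete, here is a minimal sketch, in PyTorch, of one way an instance-vs-group contrastive term could be written. The local grouping step (plain k-means here), the function names, and the hyper-parameters (number of groups k, temperature) are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn.functional as F


def local_groups(feats, k, iters=10):
    """Plain k-means on L2-normalized features (an assumed stand-in for the
    paper's local instance grouping). Returns centroids and assignments."""
    feats = feats.detach()  # treat the grouping as a fixed target
    centroids = feats[torch.randperm(feats.size(0))[:k]]
    for _ in range(iters):
        assign = (feats @ centroids.t()).argmax(dim=1)  # cosine sim; inputs unit-norm
        for j in range(k):
            members = feats[assign == j]
            if len(members) > 0:
                centroids[j] = F.normalize(members.mean(dim=0), dim=0)
    return centroids, assign


def cross_level_loss(z_a, z_b, k=8, temperature=0.2):
    """Attract each instance of view A to the local-group centroid that its
    other view (B) falls into; repel it from the remaining centroids."""
    z_a, z_b = F.normalize(z_a, dim=1), F.normalize(z_b, dim=1)
    centroids, assign_b = local_groups(z_b, k)
    logits = z_a @ centroids.t() / temperature  # (batch, k) instance-vs-group scores
    return F.cross_entropy(logits, assign_b)
```

In training one would symmetrize the term, e.g. 0.5 * (cross_level_loss(z_a, z_b) + cross_level_loss(z_b, z_a)), alongside the usual instance-level contrastive loss.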
Our batch-wise and cross-view comparisons also greatly improve the
positive/negative sample ratio of contrastive learning and achieve better
invariant mapping. To effect both grouping and discrimination objectives, we
impose them on features separately derived from a shared representation. In
addition, we propose normalized projection heads and unsupervised
hyper-parameter tuning for the first time.
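The point about imposing the grouping and discrimination objectives on features "separately derived from a shared representation", together with the normalized projection heads, can be pictured as a two-branch encoder. The sketch below is one plausible reading of that sentence; the class and attribute names are hypothetical:

```python
import torch.nn as nn
import torch.nn.functional as F


class DualHeadEncoder(nn.Module):
    """A shared backbone with two normalized projection heads: one feature
    for instance-level discrimination, one for grouping (names hypothetical)."""

    def __init__(self, backbone, feat_dim=2048, proj_dim=128):
        super().__init__()
        self.backbone = backbone          # e.g. a ResNet with its fc layer removed
        self.instance_head = nn.Linear(feat_dim, proj_dim, bias=False)
        self.group_head = nn.Linear(feat_dim, proj_dim, bias=False)

    def forward(self, x):
        h = self.backbone(x).flatten(1)   # shared representation
        z_inst = F.normalize(self.instance_head(h), dim=1)  # instance branch
        z_group = F.normalize(self.group_head(h), dim=1)    # cross-level branch
        return z_inst, z_group
```

Keeping the two branches separate lets the attraction (invariant mapping) and repulsion (grouping) objectives shape different projections of the same backbone feature instead of pulling a single embedding in two directions.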
Our extensive experimentation demonstrates that CLD is a lean and powerful
add-on to existing methods such as NPID, MoCo, InfoMin, and BYOL on highly
correlated, long-tail, or balanced datasets. It not only achieves a new state
of the art on self-supervision, semi-supervision, and transfer learning
benchmarks, but also beats MoCo v2 and SimCLR on every reported result, even
those attained with much larger compute. CLD effectively brings unsupervised
learning closer to natural data and real-world applications. Our code is
publicly available at: https://github.com/frank-xwang/CLD-UnsupervisedLearning.
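Putting the two sketches above together, a single training step for the add-on could look like the following; the toy backbone, the random "views", and the 0.25 loss weight are assumptions for illustration only:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-ins so the sketch runs end to end; real training would use a
# ResNet backbone and two augmented views of each image from a data loader.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 2048), nn.ReLU())
encoder = DualHeadEncoder(backbone)                 # from the earlier sketch
opt = torch.optim.SGD(encoder.parameters(), lr=0.03, momentum=0.9)

x_a, x_b = torch.randn(64, 3, 32, 32), torch.randn(64, 3, 32, 32)
zi_a, zg_a = encoder(x_a)
zi_b, zg_b = encoder(x_b)

# Instance branch: cross-view InfoNCE over the batch (positives on the
# diagonal), i.e. the batch-wise, cross-view comparison the abstract mentions.
logits = zi_a @ zi_b.t() / 0.2
loss_inst = F.cross_entropy(logits, torch.arange(logits.size(0)))

# Group branch: symmetrized cross-level term from the earlier sketch.
loss_cld = 0.5 * (cross_level_loss(zg_a, zg_b) + cross_level_loss(zg_b, zg_a))

loss = loss_inst + 0.25 * loss_cld                  # the weight is an assumption
opt.zero_grad()
loss.backward()
opt.step()
```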
Related papers
- Decoupled Contrastive Learning for Long-Tailed Recognition (2024-03-10):
Supervised Contrastive Loss (SCL) is popular in visual representation learning.
In long-tailed recognition, where the number of samples per class is
imbalanced, treating the two types of positive samples equally biases the
optimization of intra-category distances. The authors propose patch-based
self-distillation to transfer knowledge from head to tail classes and relieve
the under-representation of tail classes.
- FedUV: Uniformity and Variance for Heterogeneous Federated Learning
(2024-02-27): Federated learning is a promising framework for training neural
networks on widely distributed data. Recent work has shown that the degradation
seen with heterogeneous data is largely due to the final layer of the network
being the most prone to local bias. The authors investigate the training
dynamics of the classifier by applying SVD to its weights, motivated by the
observation that freezing the weights results in constant singular values.
- Semantic Positive Pairs for Enhancing Visual Representation Learning of
Instance Discrimination methods (2023-06-28): Self-supervised learning (SSL)
algorithms based on instance discrimination have shown promising results. The
authors propose identifying images with similar semantic content and treating
them as positive instances, with experiments on ImageNet, STL-10, and CIFAR-10
across different instance-discrimination SSL approaches.
- Beyond Instance Discrimination: Relation-aware Contrastive Self-supervised
Learning (2022-11-02): Presents relation-aware contrastive self-supervised
learning (ReCo) to integrate instance relations; ReCo consistently yields
notable performance improvements.
- Non-contrastive representation learning for intervals from well logs
(2022-09-28): The representation learning problem in the oil & gas industry
aims to build a model that represents a well interval from its logging data.
One possible approach is self-supervised learning (SSL); the authors are the
first to introduce non-contrastive SSL for well-logging data.
- Adaptive Soft Contrastive Learning (2022-07-22): Proposes an adaptive method
that introduces soft inter-sample relations, named Adaptive Soft Contrastive
Learning (ASCL). As an effective and concise plug-in for existing
self-supervised learning frameworks, ASCL achieves the best performance on
several benchmarks.
- Weakly Supervised Contrastive Learning (2021-10-10): Introduces a weakly
supervised contrastive learning framework (WCL) that reaches 65% and 72%
ImageNet top-1 accuracy with ResNet50, higher than SimCLRv2 with ResNet101.
- Hybrid Dynamic Contrast and Probability Distillation for Unsupervised Person
Re-Id (2021-09-29): Unsupervised person re-identification (Re-Id) has attracted
increasing attention due to its practical application in real-world video
surveillance systems. The paper presents a hybrid dynamic cluster contrast and
probability distillation algorithm, formulating unsupervised Re-Id as a unified
local-to-global dynamic contrastive learning and self-supervised probability
distillation framework.