Rebalanced Siamese Contrastive Mining for Long-Tailed Recognition
- URL: http://arxiv.org/abs/2203.11506v1
- Date: Tue, 22 Mar 2022 07:30:38 GMT
- Title: Rebalanced Siamese Contrastive Mining for Long-Tailed Recognition
- Authors: Zhisheng Zhong, Jiequan Cui, Eric Lo, Zeming Li, Jian Sun, Jiaya Jia
- Abstract summary: We show that supervised contrastive learning suffers from a dual class-imbalance problem at both the original batch and Siamese batch levels.
We propose supervised hard positive and negative pair mining to select informative pairs for contrastive computation and improve representation learning.
- Score: 120.80038161330623
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks perform poorly on heavily class-imbalanced datasets.
Given the promising performance of contrastive learning, we propose
$\mathbf{Re}$balanced $\mathbf{S}$iamese $\mathbf{Co}$ntrastive
$\mathbf{m}$ining ($\mathbf{ResCom}$) to tackle imbalanced recognition. Based
on mathematical analysis and simulation results, we claim that supervised
contrastive learning suffers from a dual class-imbalance problem at both the
original batch and Siamese batch levels, which is more serious than the
imbalance in long-tailed classification learning. At the original batch level, we
introduce a class-balanced supervised contrastive loss to assign adaptive
weights for different classes. At the Siamese batch level, we present a
class-balanced queue, which maintains the same number of keys for all classes.
Furthermore, we note that the contrastive loss gradient with respect to the
contrastive logits can be decoupled into the positives and negatives, and easy
positives and easy negatives will make the contrastive gradient vanish. We
propose supervised hard positive and negative pair mining to select
informative pairs for contrastive computation and improve representation
learning. Finally, to approximately maximize the mutual information between the
two views, we propose Siamese Balanced Softmax and combine it with the
contrastive loss for one-stage training. ResCom outperforms previous
methods by large margins on multiple long-tailed recognition benchmarks. Our
code will be made publicly available at:
https://github.com/dvlab-research/ResCom.
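The abstract is concrete enough to sketch. Below is a minimal, self-contained PyTorch illustration (not the authors' released code; see the repository above for that) of two of the four components: class-balanced weighting of the supervised contrastive loss and hard positive/negative pair mining. The function name, the inverse-frequency weights, and the top-k mining rule are assumptions for illustration.

```python
# Minimal sketch (not the authors' code) of two ResCom ingredients:
# (i) class-balanced weighting of the supervised contrastive loss and
# (ii) hard positive/negative pair mining.
import torch
import torch.nn.functional as F

def rebalanced_supcon_loss(feats, labels, temperature=0.1, k_neg=64):
    """feats: (B, D) embeddings; labels: (B,) int64 class ids."""
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t() / temperature                 # contrastive logits
    B = feats.size(0)
    eye = torch.eye(B, dtype=torch.bool, device=feats.device)
    pos_mask = (labels[:, None] == labels[None, :]) & ~eye
    neg_mask = ~pos_mask & ~eye

    # (i) Class-balanced weights: anchors from classes that are rare in
    # the batch contribute more, so head classes cannot dominate.
    counts = torch.bincount(labels, minlength=int(labels.max()) + 1).float()
    weight = 1.0 / counts[labels]
    weight = weight / weight.sum() * B                    # mean weight = 1

    loss, n_anchors = feats.new_zeros(()), 0
    for i in range(B):
        pos, neg = sim[i][pos_mask[i]], sim[i][neg_mask[i]]
        if pos.numel() == 0:
            continue
        # (ii) Hard mining: easy positives/negatives give near-zero
        # gradient, so keep the hardest positive and hardest negatives.
        hard_pos = pos.min().unsqueeze(0)
        hard_neg = neg.topk(min(k_neg, neg.numel())).values
        logits = torch.cat([hard_pos, hard_neg]).unsqueeze(0)
        target = torch.zeros(1, dtype=torch.long, device=feats.device)
        loss = loss + weight[i] * F.cross_entropy(logits, target)
        n_anchors += 1
    return loss / max(n_anchors, 1)

# Toy usage: an imbalanced batch of 8 samples over 3 classes.
print(rebalanced_supcon_loss(torch.randn(8, 16),
                             torch.tensor([0, 0, 0, 0, 0, 1, 1, 2])))
```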
Related papers
- Decoupled Contrastive Learning for Long-Tailed Recognition [58.255966442426484]
Supervised Contrastive Loss (SCL) is popular in visual representation learning.
In long-tailed recognition, where class sizes are imbalanced, treating the two types of positive samples equally leads to biased optimization of intra-category distances.
We propose patch-based self-distillation to transfer knowledge from head to tail classes and relieve the under-representation of tail classes.
arXiv Detail & Related papers (2024-03-10T09:46:28Z) - Mixup Your Own Pairs [22.882694278940598]
We argue that the potential of contrastive learning for regression has been overshadowed due to the neglect of two crucial aspects: ordinality-awareness and hardness.
Specifically, we propose Supervised Contrastive Learning for Regression with Mixup (SupReMix).
It takes anchor-inclusive mixtures (mixup of the anchor and a distinct negative sample) as hard negative pairs and anchor-exclusive mixtures (mixup of two distinct negative samples) as hard positive pairs at the embedding level.
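The pair construction is concrete enough to sketch. Below is a hypothetical PyTorch version of the two mixture types; the name `supremix_pairs`, the Beta-distributed mixing weights, and the random pairings are our assumptions, not the authors' exact recipe.

```python
# Hedged sketch of the two SupReMix mixture types at the embedding level.
import torch

def supremix_pairs(z, y, alpha=1.0):
    """z: (B, D) embeddings; y: (B,) continuous regression targets."""
    B = z.size(0)
    lam = torch.distributions.Beta(alpha, alpha).sample((B,)).to(z.device)
    p1 = torch.randperm(B, device=z.device)
    p2 = torch.randperm(B, device=z.device)
    # Anchor-inclusive mixture: blend each anchor with another sample.
    # Its mixed label differs from the anchor's, so it acts as a *hard
    # negative* that still lies close to the anchor in embedding space.
    hard_neg = lam[:, None] * z + (1 - lam[:, None]) * z[p1]
    hard_neg_y = lam * y + (1 - lam) * y[p1]
    # Anchor-exclusive mixture: blend two *other* samples; picking lam so
    # the mixed label lands near the anchor's label would yield a *hard
    # positive* built without the anchor itself (selection omitted here).
    hard_pos = lam[:, None] * z[p1] + (1 - lam[:, None]) * z[p2]
    hard_pos_y = lam * y[p1] + (1 - lam) * y[p2]
    return (hard_pos, hard_pos_y), (hard_neg, hard_neg_y)
```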
arXiv Detail & Related papers (2023-09-28T17:38:59Z) - Class Instance Balanced Learning for Long-Tailed Classification [0.0]
The long-tailed image classification task deals with large imbalances in the class frequencies of the training data.
Previous approaches have shown that combining cross-entropy and contrastive learning can improve performance on the long-tailed task.
We propose a novel class instance balanced loss (CIBL), which reweights the relative contributions of a cross-entropy loss and a contrastive loss according to the frequency of each class's instances in the training batch.
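One plausible reading of that reweighting, sketched in PyTorch below; `cibl_loss` and the specific blending rule are our guesses, and the paper's actual weighting function may differ.

```python
# Illustrative only: blend per-sample cross-entropy and contrastive losses
# according to in-batch class frequency. The direction of the blend is an
# assumption, not the paper's exact rule.
import torch

def cibl_loss(ce_per_sample, con_per_sample, labels):
    """ce_per_sample, con_per_sample: (B,) unreduced losses; labels: (B,)."""
    counts = torch.bincount(labels, minlength=int(labels.max()) + 1).float()
    freq = counts[labels] / labels.numel()   # in-batch class frequency in [0, 1]
    # Frequent (head) classes lean on the contrastive term, rare (tail)
    # classes on cross-entropy -- one way to realize instance balancing.
    return ((1 - freq) * ce_per_sample + freq * con_per_sample).mean()
```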
arXiv Detail & Related papers (2023-07-11T15:09:10Z) - Balanced Contrastive Learning for Long-Tailed Visual Recognition [32.789465918318925]
Real-world data typically follow a long-tailed distribution, where a few majority categories occupy most of the data.
In this paper, we focus on representation learning for imbalanced data.
We propose a novel loss for balanced contrastive learning (BCL).
arXiv Detail & Related papers (2022-07-19T03:48:59Z) - Relieving Long-tailed Instance Segmentation via Pairwise Class Balance [85.53585498649252]
Long-tailed instance segmentation is a challenging task due to the extreme imbalance of training samples among classes.
It severely biases the model toward the head classes (those with the majority of samples) and against the tail ones.
We propose a novel Pairwise Class Balance (PCB) method, built upon a confusion matrix which is updated during training to accumulate the ongoing prediction preferences.
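The confusion-matrix ingredient can be sketched directly. The exponential-moving-average update and the log-confusion correction below are our simplifications of how such a matrix might be maintained and used, not the paper's exact formulation.

```python
# Running confusion matrix that accumulates the model's ongoing prediction
# preferences, as the summary describes; the correction term is illustrative.
import torch

class RunningConfusion:
    def __init__(self, num_classes, momentum=0.99):
        self.m = momentum
        self.cm = torch.full((num_classes, num_classes), 1.0 / num_classes)

    @torch.no_grad()
    def update(self, logits, labels):
        probs = logits.softmax(dim=1)                  # (B, C)
        for c in labels.unique():
            row = probs[labels == c].mean(dim=0)       # avg prediction for class c
            self.cm[c] = self.m * self.cm[c] + (1 - self.m) * row

    def correction(self, labels):
        # Down-weight classes the model habitually confuses with the true
        # one, e.g. by adding log-confusion to the logits (hypothetical).
        return torch.log(self.cm[labels].clamp_min(1e-8))
```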
arXiv Detail & Related papers (2022-01-08T07:48:36Z) - Modality-Aware Triplet Hard Mining for Zero-shot Sketch-Based Image Retrieval [51.42470171051007]
This paper tackles the Zero-Shot Sketch-Based Image Retrieval (ZS-SBIR) problem from the viewpoint of cross-modality metric learning.
By combining two fundamental learning approaches in deep metric learning (DML), i.e., classification training and pairwise training, we set up a strong baseline for ZS-SBIR.
We show that Modality-Aware Triplet Hard Mining (MATHM) enhances the baseline with three types of pairwise learning.
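A hedged sketch of what modality-aware batch-hard triplet mining could look like; mapping the three pairwise types to sketch-sketch, photo-photo, and sketch-photo triplets is our assumption.

```python
# Batch-hard triplet mining applied within and across the two modalities.
import torch
import torch.nn.functional as F

def batch_hard_triplet(emb_a, emb_b, y_a, y_b, margin=0.3):
    """Hardest-in-batch triplet loss with anchors from A against pool B."""
    d = torch.cdist(F.normalize(emb_a, dim=1), F.normalize(emb_b, dim=1))
    same = y_a[:, None] == y_b[None, :]
    hard_pos = d.masked_fill(~same, float('-inf')).max(dim=1).values
    hard_neg = d.masked_fill(same, float('inf')).min(dim=1).values
    return F.relu(hard_pos - hard_neg + margin).mean()

def mathm_style_loss(sk, ph, y_sk, y_ph):
    return (batch_hard_triplet(sk, sk, y_sk, y_sk)     # within-sketch
            + batch_hard_triplet(ph, ph, y_ph, y_ph)   # within-photo
            + batch_hard_triplet(sk, ph, y_sk, y_ph))  # cross-modality
```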
arXiv Detail & Related papers (2021-12-15T08:36:44Z) - You Only Need End-to-End Training for Long-Tailed Recognition [8.789819609485225]
Cross-entropy loss tends to produce highly correlated features on imbalanced data.
We propose two novel modules, Block-based Relatively Balanced Batch Sampler (B3RS) and Batch Embedded Training (BET).
Experimental results on the long-tailed classification benchmarks, CIFAR-LT and ImageNet-LT, demonstrate the effectiveness of our method.
arXiv Detail & Related papers (2021-12-11T11:44:09Z) - Parametric Contrastive Learning [65.70554597097248]
We propose Parametric Contrastive Learning (PaCo) to tackle long-tailed recognition.
PaCo can adaptively enhance the intensity with which samples of the same class are pulled closer together.
Experiments on long-tailed CIFAR, ImageNet, Places, and iNaturalist 2018 manifest the new state-of-the-art for long-tailed recognition.
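PaCo's core ingredient, per its paper, is a set of parametric class-wise learnable centers that join the contrastive computation. The minimal sketch below keeps only that center term; the class name and the simplified loss are ours, and the full PaCo objective also contrasts batch samples against each other.

```python
# Learnable class centers acting as always-available positives/negatives.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PaCoCenters(nn.Module):
    def __init__(self, num_classes, dim, temperature=0.07):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, dim))
        self.t = temperature

    def forward(self, feats, labels):
        feats = F.normalize(feats, dim=1)
        centers = F.normalize(self.centers, dim=1)
        logits = feats @ centers.t() / self.t          # sample-to-center logits
        # Pull each sample toward its own class center; combined with
        # sample-to-sample contrast, this rebalances the optimization.
        return F.cross_entropy(logits, labels)
```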
arXiv Detail & Related papers (2021-07-26T08:37:23Z) - Mitigating Dataset Imbalance via Joint Generation and Classification [17.57577266707809]
Supervised deep learning methods enjoy enormous success in many practical applications of computer vision.
However, their marked performance degradation on biased and imbalanced data calls the reliability of these methods into question.
We introduce a joint dataset repairment strategy that combines a neural network classifier with a Generative Adversarial Network (GAN).
We show that the combined training helps to improve the robustness of both the classifier and the GAN against severe class imbalance.
arXiv Detail & Related papers (2020-08-12T18:40:38Z) - Long-Tailed Recognition Using Class-Balanced Experts [128.73438243408393]
We propose an ensemble of class-balanced experts that combines the strengths of diverse classifiers.
Our ensemble of class-balanced experts reaches results close to the state of the art, and an extended ensemble establishes a new state of the art on two benchmarks for long-tailed recognition.
arXiv Detail & Related papers (2020-04-07T20:57:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.