Supervised Contrastive Learning on Blended Images for Long-tailed Recognition
- URL: http://arxiv.org/abs/2211.11938v1
- Date: Tue, 22 Nov 2022 01:19:00 GMT
- Title: Supervised Contrastive Learning on Blended Images for Long-tailed Recognition
- Authors: Minki Jeong, Changick Kim
- Abstract summary: Real-world data often have a long-tailed distribution, where the number of samples per class is not equal over training classes.
In this paper, we propose a novel long-tailed recognition method to balance the latent feature space.
- Score: 32.876647081080655
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world data often have a long-tailed distribution, where the number of
samples per class is not equal over training classes. The imbalanced data form
a biased feature space, which deteriorates the performance of the recognition
model. In this paper, we propose a novel long-tailed recognition method to
balance the latent feature space. First, we introduce a MixUp-based data
augmentation technique to reduce the bias of the long-tailed data. Furthermore,
we propose a new supervised contrastive learning method, named Supervised
contrastive learning on Mixed Classes (SMC), for blended images. SMC creates a
set of positives based on the class labels of the original images, and the
combination ratio of the blended image weights each positive in the training
loss. SMC with this class-mixture-based loss explores a more diverse data
space, enhancing the generalization capability of the model. Extensive
experiments on various
benchmarks show the effectiveness of our one-stage training method.
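The abstract describes the mechanism concretely enough to sketch: blend pairs of images with MixUp, treat samples that share either source label as positives, and weight each positive by the corresponding combination ratio. Below is a minimal PyTorch sketch of that idea; the function names, exact loss formulation, and all hyperparameters are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn.functional as F

def mixup(x, y, alpha=1.0):
    # Blend each image with a randomly permuted partner; keep both source labels.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1.0 - lam) * x[perm], y, y[perm], lam

def smc_loss(z, y_a, y_b, lam, temperature=0.1):
    # Supervised contrastive loss over embeddings of blended images: sample j is
    # a positive for anchor i if it shares i's first source label (weight lam)
    # or i's second source label (weight 1 - lam).
    z = F.normalize(z, dim=1)                    # embeddings on the unit hypersphere
    sim = z @ z.T / temperature                  # pairwise similarities
    not_self = ~torch.eye(z.size(0), dtype=torch.bool, device=z.device)

    # Log-softmax over all other samples, excluding self-comparisons.
    exp_sim = torch.exp(sim) * not_self.float()
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-8)

    # Combination-ratio-weighted positive mask from the two sets of source labels.
    pos_a = (y_a.unsqueeze(0) == y_a.unsqueeze(1)).float()
    pos_b = (y_b.unsqueeze(0) == y_b.unsqueeze(1)).float()
    weights = (lam * pos_a + (1.0 - lam) * pos_b) * not_self.float()

    denom = weights.sum(dim=1).clamp(min=1e-8)
    return -((weights * log_prob).sum(dim=1) / denom).mean()

# Hypothetical usage with any encoder that maps images to embedding vectors:
# x_mix, y_a, y_b, lam = mixup(images, labels)
# loss = smc_loss(encoder(x_mix), y_a, y_b, lam)
```

Note that with lam near 1 the weighted mask reduces to ordinary supervised contrastive learning on the first source labels, so the blending ratio smoothly interpolates between the two single-class positive sets.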
Related papers
- SMCL: Saliency Masked Contrastive Learning for Long-tailed Recognition [19.192861880360347]
We propose saliency masked contrastive learning to mitigate the problem of biased predictions.
Our key idea is to mask the important part of an image using saliency detection and use contrastive learning to move the masked image towards minor classes in the feature space.
Experiment results show that our method achieves state-of-the-art level performance on benchmark long-tailed datasets.
arXiv Detail & Related papers (2024-06-04T11:33:40Z)
- Training Class-Imbalanced Diffusion Model Via Overlap Optimization [55.96820607533968]
Diffusion models trained on real-world datasets often yield inferior fidelity for tail classes.
Deep generative models, including diffusion models, are biased towards classes with abundant training images.
We propose a method based on contrastive learning to minimize the overlap between distributions of synthetic images for different classes.
arXiv Detail & Related papers (2024-02-16T16:47:21Z)
- Constructing Balance from Imbalance for Long-tailed Image Recognition [50.6210415377178]
The imbalance between majority (head) classes and minority (tail) classes severely biases data-driven deep neural networks.
Previous methods tackle data imbalance from the viewpoints of data distribution, feature space, and model design.
We propose a concise paradigm that progressively adjusts the label space and divides the head classes and tail classes.
Our proposed model also provides a feature evaluation method and paves the way for long-tailed feature learning.
arXiv Detail & Related papers (2022-08-04T10:22:24Z)
- DBN-Mix: Training Dual Branch Network Using Bilateral Mixup Augmentation for Long-Tailed Visual Recognition [7.94190631530826]
We develop a simple yet effective method to improve the performance of the dual branch network (DBN) without cumulative learning.
We present class-conditional temperature scaling that mitigates bias toward the majority class for the proposed DBN architecture.
arXiv Detail & Related papers (2022-07-05T17:01:27Z)
- Semantically Proportional Patchmix for Few-Shot Learning [16.24173112047382]
Few-shot learning aims to classify unseen classes with only a limited number of labeled samples.
Recent works have demonstrated that training models with a simple transfer learning strategy can achieve competitive results in few-shot classification.
We propose SePPMix, in which patches are cut and pasted among training images and the ground-truth labels are mixed in proportion to the semantic information of the patches.
arXiv Detail & Related papers (2022-02-17T13:24:33Z)
- Targeted Supervised Contrastive Learning for Long-Tailed Recognition [50.24044608432207]
Real-world data often exhibit long-tailed distributions with heavy class imbalance.
We show that while supervised contrastive learning can help improve performance, past baselines suffer from the poor uniformity brought on by the imbalanced data distribution.
We propose targeted supervised contrastive learning (TSC), which improves the uniformity of the feature distribution on the hypersphere.
arXiv Detail & Related papers (2021-11-27T22:40:10Z)
- Improving Tail-Class Representation with Centroid Contrastive Learning [145.73991900239017]
We propose interpolative centroid contrastive learning (ICCL) to improve long-tailed representation learning.
ICCL interpolates two images from a class-agnostic sampler and a class-aware sampler, and trains the model such that the representation of the interpolated image can be used to retrieve the class centroids of both source classes.
Our result shows a significant accuracy gain of 2.8% on the iNaturalist 2018 dataset with a real-world long-tailed distribution.
arXiv Detail & Related papers (2021-10-19T15:24:48Z)
- Class-Balanced Distillation for Long-Tailed Visual Recognition [100.10293372607222]
Real-world imagery is often characterized by a significant imbalance of the number of images per class, leading to long-tailed distributions.
In this work, we introduce a new framework built on the key observation that a feature representation learned with instance sampling is far from optimal in a long-tailed setting (a minimal class-balanced sampling sketch appears after the last entry below).
Our main contribution is a new training method that leverages knowledge distillation to enhance feature representations.
arXiv Detail & Related papers (2021-04-12T08:21:03Z)
- Contrastive Learning based Hybrid Networks for Long-Tailed Image Classification [31.647639786095993]
We propose a novel hybrid network structure composed of a supervised contrastive loss to learn image representations and a cross-entropy loss to learn classifiers.
Experiments on three long-tailed classification datasets demonstrate the advantage of the proposed contrastive learning based hybrid networks in long-tailed classification.
arXiv Detail & Related papers (2021-03-26T05:22:36Z)
- Long-Tailed Recognition Using Class-Balanced Experts [128.73438243408393]
We propose an ensemble of class-balanced experts that combines the strength of diverse classifiers.
Our ensemble of class-balanced experts reaches results close to the state of the art, and an extended ensemble establishes a new state of the art on two benchmarks for long-tailed recognition.
arXiv Detail & Related papers (2020-04-07T20:57:44Z)
- Imbalanced Data Learning by Minority Class Augmentation using Capsule Adversarial Networks [31.073558420480964]
We propose a method to restore balance to imbalanced image data by coalescing two concurrent methods.
In our model, generative and discriminative networks play a novel competitive game.
The coalesced capsule-GAN is effective at recognizing highly overlapping classes with far fewer parameters than the convolutional GAN.
arXiv Detail & Related papers (2020-04-05T12:36:06Z)
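Several entries above lean on class-balanced sampling (e.g., the class-aware sampler in ICCL and the instance-sampling observation in Class-Balanced Distillation): a standard rebalancing strategy in which every class is drawn equally often regardless of its sample count. Below is a minimal, self-contained Python sketch of that generic strategy; it is not code from any of the papers listed, and all names are illustrative.

```python
# Class-balanced sampling: pick a class uniformly at random, then a sample
# within it, so tail classes are drawn as often as head classes.
# Generic illustration only; not taken from any paper above.
import random
from collections import defaultdict

class ClassBalancedSampler:
    def __init__(self, labels, num_draws, seed=0):
        self.by_class = defaultdict(list)
        for idx, label in enumerate(labels):
            self.by_class[label].append(idx)   # group dataset indices by class
        self.classes = list(self.by_class)
        self.num_draws = num_draws
        self.rng = random.Random(seed)

    def __iter__(self):
        for _ in range(self.num_draws):
            c = self.rng.choice(self.classes)          # uniform over classes
            yield self.rng.choice(self.by_class[c])    # uniform within the class

# Hypothetical usage: class 0 has four samples, class 1 only one (a tail
# class), yet both classes are drawn with equal probability.
# sampler = ClassBalancedSampler(labels=[0, 0, 0, 0, 1], num_draws=8)
# print(list(sampler))
```

This is the counterpoint to plain instance sampling, which draws each image with equal probability and therefore lets head classes dominate the learned representation.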