Improving Tail-Class Representation with Centroid Contrastive Learning
- URL: http://arxiv.org/abs/2110.10048v2
- Date: Thu, 4 May 2023 13:48:07 GMT
- Title: Improving Tail-Class Representation with Centroid Contrastive Learning
- Authors: Anthony Meng Huat Tiong, Junnan Li, Guosheng Lin, Boyang Li, Caiming
Xiong, Steven C.H. Hoi
- Abstract summary: We propose interpolative centroid contrastive learning (ICCL) to improve long-tailed representation learning.
ICCL interpolates two images, one from a class-agnostic sampler and one from a class-aware sampler, and trains the model such that the representation of the interpolative image can be used to retrieve the centroids of both source classes.
Our result shows a significant accuracy gain of 2.8% on the iNaturalist 2018 dataset with a real-world long-tailed distribution.
- Score: 145.73991900239017
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the vision domain, large-scale natural datasets typically
exhibit a long-tailed distribution with a large class imbalance between head
and tail classes. This distribution makes it difficult to learn good
representations for tail classes. Recent work has shown that a good
long-tailed model can be learned by decoupling training into representation
learning and classifier balancing. However, these works give insufficient
consideration to the effect of the long-tailed distribution on representation
learning itself. In this work, we propose interpolative centroid contrastive
learning (ICCL) to improve long-tailed representation learning. ICCL
interpolates two images, one drawn from a class-agnostic sampler and one from
a class-aware sampler, and trains the model such that the representation of
the interpolative image can be used to retrieve the centroids of both source
classes. We demonstrate the effectiveness of our approach on multiple
long-tailed image classification benchmarks. Our results show a significant
accuracy gain of 2.8% on the iNaturalist 2018 dataset, which has a real-world
long-tailed distribution.
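As a minimal sketch of the objective described above, assuming a PyTorch-style encoder and per-class feature centroids maintained elsewhere (e.g. as moving averages of class features); the Beta mixing distribution, temperature, and all names are illustrative rather than the authors' implementation:

```python
import torch
import torch.nn.functional as F

def iccl_loss(encoder, x_agnostic, y_agnostic, x_aware, y_aware,
              centroids, alpha=0.5, temperature=0.1):
    """Sketch of interpolative centroid contrastive learning (ICCL).

    x_agnostic/y_agnostic: batch from a class-agnostic (instance-uniform) sampler.
    x_aware/y_aware: batch from a class-aware (class-balanced) sampler.
    centroids: (num_classes, dim) per-class feature centroids, assumed to be
        maintained outside this function. alpha and temperature are
        illustrative hyperparameters.
    """
    # Mixup-style interpolation of the two source images.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    x_mix = lam * x_agnostic + (1.0 - lam) * x_aware

    # Embed the interpolated image and compare it with all class centroids.
    z = F.normalize(encoder(x_mix), dim=1)                      # (B, dim)
    c = F.normalize(centroids, dim=1)                           # (C, dim)
    log_prob = F.log_softmax(z @ c.t() / temperature, dim=1)    # (B, C)

    # The interpolated representation must retrieve the centroids of BOTH
    # source classes, weighted by the mixing coefficient.
    loss = -(lam * log_prob.gather(1, y_agnostic.view(-1, 1)).squeeze(1)
             + (1.0 - lam) * log_prob.gather(1, y_aware.view(-1, 1)).squeeze(1))
    return loss.mean()
```

Assuming the class-aware sampler is class-balanced, tail centroids appear as retrieval targets far more often than they would under instance-uniform sampling alone, which is how the interpolation can help tail-class representations.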
Related papers
- SuperDisco: Super-Class Discovery Improves Visual Recognition for the
Long-Tail [69.50380510879697]
We propose SuperDisco, an algorithm that discovers super-class representations for long-tailed recognition.
We learn to construct a super-class graph that guides representation learning under long-tailed distributions.
arXiv Detail & Related papers (2023-03-31T19:51:12Z)
- Improving GANs for Long-Tailed Data through Group Spectral Regularization [51.58250647277375]
We propose a novel group Spectral Regularizer (gSR) that prevents spectral explosion, thereby alleviating mode collapse.
We find that gSR effectively combines with existing augmentation and regularization techniques, leading to state-of-the-art image generation performance on long-tailed data.
arXiv Detail & Related papers (2022-08-21T17:51:05Z)
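As a hedged illustration of the group spectral regularization idea in the entry above: one way to keep grouped parameters from "spectrally exploding" is to penalize the largest singular value of each group's parameter matrix. Which parameters are regularized and how they are grouped is an assumption here, not the paper's exact recipe.

```python
import torch

def group_spectral_penalty(weight, num_groups):
    """Penalize the spectral norm (largest singular value) of grouped slices
    of a (num_classes, dim) parameter matrix, e.g. class-conditional
    embedding or normalization parameters. The grouping is illustrative.
    """
    num_classes, dim = weight.shape
    assert num_classes % num_groups == 0
    groups = weight.reshape(num_groups, num_classes // num_groups, dim)
    # Batched spectral norms, one per group; their sum is the penalty.
    return torch.linalg.matrix_norm(groups, ord=2).sum()
```

Such a penalty would be added to the GAN training loss with a small coefficient, alongside the augmentation and regularization techniques the summary mentions.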
- Balanced Contrastive Learning for Long-Tailed Visual Recognition [32.789465918318925]
Real-world data typically follow a long-tailed distribution, where a few majority categories occupy most of the data.
In this paper, we focus on representation learning for imbalanced data.
We propose a novel loss for balanced contrastive learning (BCL).
arXiv Detail & Related papers (2022-07-19T03:48:59Z)
- DBN-Mix: Training Dual Branch Network Using Bilateral Mixup Augmentation for Long-Tailed Visual Recognition [7.94190631530826]
We develop a simple yet effective method to improve the performance of DBN without cumulative learning.
We present class-conditional temperature scaling that mitigates bias toward the majority class for the proposed DBN architecture.
arXiv Detail & Related papers (2022-07-05T17:01:27Z)
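The class-conditional temperature scaling mentioned in the DBN-Mix entry above can be pictured as giving each class its own softmax temperature. In the hedged sketch below the temperature grows with class frequency so that head-class logits are softened; the power-law schedule and `gamma` are assumptions, not the paper's formula.

```python
import torch

def class_conditional_temperature(logits, class_counts, gamma=0.5):
    """Divide each class logit by a class-specific temperature derived from
    per-class training counts (1-D tensor); the schedule is illustrative."""
    freq = class_counts.float() / class_counts.sum()
    temperature = (freq / freq.min()) ** gamma   # >= 1, largest for head classes
    return logits / temperature                  # broadcasts over the batch dim
```

During training the scaled logits would feed the usual cross-entropy loss, which is one way to reduce the bias toward the majority classes that the summary describes.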
- Learning Muti-expert Distribution Calibration for Long-tailed Video Classification [88.12433458277168]
We propose an end-to-end multi-expert distribution calibration method based on two-level distribution information.
By modeling this two-level distribution information, the model can account for both the head classes and the tail classes.
Our method achieves state-of-the-art performance on the long-tailed video classification task.
arXiv Detail & Related papers (2022-05-22T09:52:34Z)
- Feature Generation for Long-tail Classification [36.186909933006675]
We show how to generate meaningful features by estimating the tail category's distribution.
We also present a qualitative analysis of generated features using t-SNE visualizations and analyze the nearest neighbors used to calibrate the tail class distributions.
arXiv Detail & Related papers (2021-11-10T21:34:29Z)
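One plausible reading of the feature-generation entry above, sketched with heavy assumptions: fit a simple Gaussian to each tail class's few features, calibrate its variance using statistics of the nearest data-rich classes, and sample synthetic features from it. The diagonal Gaussian, the averaging rule, and all names are illustrative, not the paper's method.

```python
import torch

def generate_tail_features(features, labels, tail_class, head_stats,
                           k=3, n_samples=64):
    """head_stats: dict {class_id: (mean, var)} computed on data-rich classes;
    features/labels: the (few) observed samples of the tail class."""
    feats = features[labels == tail_class]                 # (n_c, dim)
    mean = feats.mean(dim=0)

    # Find the k head classes whose mean features are nearest to the tail mean.
    head_means = torch.stack([m for m, _ in head_stats.values()])
    head_vars = torch.stack([v for _, v in head_stats.values()])
    dists = torch.cdist(mean.unsqueeze(0), head_means).squeeze(0)
    nearest = dists.topk(k, largest=False).indices

    # Calibrate the tail variance with the neighbors' better-estimated variance.
    var = head_vars[nearest].mean(dim=0)

    # Sample synthetic tail-class features from the calibrated Gaussian.
    return mean + var.sqrt() * torch.randn(n_samples, mean.numel())
```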
- Calibrating Class Activation Maps for Long-Tailed Visual Recognition [60.77124328049557]
We present two effective modifications of CNNs to improve network learning from long-tailed distributions.
First, we present a Class Activation Map Calibration (CAMC) module to improve the learning and prediction of network classifiers.
Second, we investigate the use of normalized classifiers for representation learning in long-tailed problems.
arXiv Detail & Related papers (2021-08-29T05:45:03Z)
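The "normalized classifiers" that the entry above investigates are commonly implemented as cosine classifiers: both the feature and each class weight vector are L2-normalized, so a head class cannot win simply by growing a larger weight norm. Whether the paper uses exactly this form is an assumption; the scale value below is an illustrative choice.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedClassifier(nn.Module):
    """Cosine (normalized) classifier: logits are scaled cosine similarities
    between L2-normalized features and L2-normalized class weights."""
    def __init__(self, dim, num_classes, scale=16.0):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(num_classes, dim))
        self.scale = scale

    def forward(self, features):                       # features: (B, dim)
        return self.scale * F.linear(F.normalize(features, dim=1),
                                     F.normalize(self.weight, dim=1))
```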
- Class-Balanced Distillation for Long-Tailed Visual Recognition [100.10293372607222]
Real-world imagery is often characterized by a significant imbalance in the number of images per class, leading to long-tailed distributions.
In this work, we introduce a new framework based on the key observation that a feature representation learned with instance sampling is far from optimal in a long-tailed setting.
Our main contribution is a new training method that leverages knowledge distillation to enhance feature representations.
arXiv Detail & Related papers (2021-04-12T08:21:03Z)
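As a hedged sketch of the distillation idea summarized above, assuming (as one possible arrangement, not confirmed by the summary) that a teacher pre-trained with ordinary instance sampling provides feature targets for a student trained on class-balanced batches; the cosine feature-matching term and `beta` are assumptions, not the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def class_balanced_distillation_step(student, teacher, classifier,
                                     x_balanced, y_balanced, beta=1.0):
    """One training step on a class-balanced batch: classify with the student
    and match the frozen teacher's feature representation."""
    with torch.no_grad():
        t_feat = teacher(x_balanced)                 # frozen teacher features
    s_feat = student(x_balanced)

    ce = F.cross_entropy(classifier(s_feat), y_balanced)
    distill = 1.0 - F.cosine_similarity(s_feat, t_feat, dim=1).mean()
    return ce + beta * distill
```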
This list is automatically generated from the titles and abstracts of the papers on this site.