Class-Balanced Distillation for Long-Tailed Visual Recognition
- URL: http://arxiv.org/abs/2104.05279v1
- Date: Mon, 12 Apr 2021 08:21:03 GMT
- Title: Class-Balanced Distillation for Long-Tailed Visual Recognition
- Authors: Ahmet Iscen, André Araujo, Boqing Gong, Cordelia Schmid
- Abstract summary: Real-world imagery is often characterized by a significant imbalance of the number of images per class, leading to long-tailed distributions.
In this work, we introduce a new framework based on the key observation that a feature representation learned with instance sampling is far from optimal in a long-tailed setting.
Our main contribution is a new training method that leverages knowledge distillation to enhance feature representations.
- Score: 100.10293372607222
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world imagery is often characterized by a significant imbalance of the
number of images per class, leading to long-tailed distributions. An effective
and simple approach to long-tailed visual recognition is to learn feature
representations and a classifier separately, with instance and class-balanced
sampling, respectively. In this work, we introduce a new framework based on
the key observation that a feature representation learned with instance
sampling is far from optimal in a long-tailed setting. Our main contribution is
a new training method, referred to as Class-Balanced Distillation (CBD), that
leverages knowledge distillation to enhance feature representations. CBD allows
the feature representation to evolve in the second training stage, guided by
the teacher learned in the first stage. The second stage uses class-balanced
sampling, in order to focus on under-represented classes. This framework can
naturally accommodate the usage of multiple teachers, unlocking the information
from an ensemble of models to enhance recognition capabilities. Our experiments
show that the proposed technique consistently outperforms the state of the art
on long-tailed recognition benchmarks such as ImageNet-LT, iNaturalist17 and
iNaturalist18. The experiments also show that our method does not sacrifice the
accuracy of head classes to improve the performance of tail classes, unlike
most existing work.
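The two-stage recipe described above translates naturally into code. Below is a minimal PyTorch sketch, assuming a feature-level MSE distillation loss, a WeightedRandomSampler for class-balanced sampling, and toy synthetic data; the paper's exact losses, architectures, and hyperparameters may differ, and every name here (make_model, train, lam) is illustrative rather than the authors' implementation.

```python
# Minimal sketch of the two-stage CBD recipe described in the abstract.
# Stage 1 trains a teacher with plain instance sampling; stage 2 trains a
# fresh student with class-balanced sampling while distilling features
# from the frozen teacher. The MSE feature-distillation loss, the toy
# data, and all hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

torch.manual_seed(0)

# Synthetic long-tailed data: class 0 is the head, class 2 the tail.
counts = [900, 90, 10]
x = torch.randn(sum(counts), 16)
y = torch.cat([torch.full((n,), c, dtype=torch.long) for c, n in enumerate(counts)])
dataset = TensorDataset(x, y)

def make_model():
    # A feature extractor (backbone) followed by a linear classifier.
    backbone = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32))
    classifier = nn.Linear(32, len(counts))
    return backbone, classifier

def train(backbone, classifier, loader, teacher=None, lam=1.0, epochs=3):
    opt = torch.optim.SGD(list(backbone.parameters()) + list(classifier.parameters()), lr=0.1)
    for _ in range(epochs):
        for xb, yb in loader:
            feats = backbone(xb)
            loss = F.cross_entropy(classifier(feats), yb)
            if teacher is not None:
                with torch.no_grad():
                    t_feats = teacher(xb)
                # Feature-level distillation toward the frozen teacher.
                loss = loss + lam * F.mse_loss(feats, t_feats)
            opt.zero_grad()
            loss.backward()
            opt.step()

# Stage 1: teacher learned with instance sampling.
t_backbone, t_classifier = make_model()
train(t_backbone, t_classifier, DataLoader(dataset, batch_size=64, shuffle=True))

# Stage 2: class-balanced sampling (each class drawn roughly equally often),
# with the student's features guided by the frozen stage-1 teacher.
weights = 1.0 / torch.tensor(counts, dtype=torch.float)[y]
sampler = WeightedRandomSampler(weights, num_samples=len(dataset))
balanced_loader = DataLoader(dataset, batch_size=64, sampler=sampler)
s_backbone, s_classifier = make_model()
t_backbone.eval()
train(s_backbone, s_classifier, balanced_loader, teacher=t_backbone)
```

The multiple-teacher variant the abstract mentions could be supported by averaging the distillation term over an ensemble of frozen stage-1 backbones.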
Related papers
- LCReg: Long-Tailed Image Classification with Latent Categories based
Recognition [81.5551335554507]
We propose the Latent Categories based long-tail Recognition (LCReg) method.
Our hypothesis is that common latent features shared by head and tail classes can be used to improve feature representation.
Specifically, we learn a set of class-agnostic latent features shared by both head and tail classes, and then use semantic data augmentation on the latent features to implicitly increase the diversity of the training samples (see the sketch after this entry).
arXiv Detail & Related papers (2023-09-13T02:03:17Z)
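As a rough illustration of "semantic data augmentation on latent features" from the LCReg summary above, the sketch below jitters each latent vector with Gaussian noise shaped by a per-class variance estimate (in the spirit of implicit semantic data augmentation). The diagonal covariance, the scale alpha, and the function name are assumptions, not the paper's method.

```python
# Hedged sketch of semantic data augmentation on latent features: jitter
# each latent vector with Gaussian noise shaped by a per-class variance
# estimate, so tail-class samples gain plausible diversity. The diagonal
# covariance and the scale `alpha` are assumptions, not LCReg's method.
import torch

def augment_latents(latents, labels, class_var, alpha=0.5):
    # latents: (B, D); labels: (B,); class_var: (C, D) per-class variances.
    std = (alpha * class_var[labels]).sqrt()
    return latents + torch.randn_like(latents) * std

# Usage: 8 latent vectors of dimension 4, 3 classes with variance stats.
latents = torch.randn(8, 4)
labels = torch.randint(0, 3, (8,))
class_var = 0.1 * torch.ones(3, 4)
augmented = augment_latents(latents, labels, class_var)
```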
- Constructing Balance from Imbalance for Long-tailed Image Recognition [50.6210415377178]
The imbalance between majority (head) classes and minority (tail) classes severely skews data-driven deep neural networks.
Previous methods tackle data imbalance from the viewpoints of data distribution, feature space, and model design.
We propose a concise paradigm that progressively adjusts the label space and separates head classes from tail classes.
Our proposed model also provides a feature evaluation method and paves the way for long-tailed feature learning.
arXiv Detail & Related papers (2022-08-04T10:22:24Z)
- DBN-Mix: Training Dual Branch Network Using Bilateral Mixup Augmentation for Long-Tailed Visual Recognition [7.94190631530826]
We develop a simple yet effective method to improve the performance of DBN without cumulative learning.
We present class-conditional temperature scaling, which mitigates bias toward the majority class in the proposed DBN architecture (a sketch follows this entry).
arXiv Detail & Related papers (2022-07-05T17:01:27Z)
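The class-conditional temperature scaling mentioned in the DBN-Mix summary can be pictured as dividing each class's logit by a frequency-dependent temperature. The sketch below is one plausible scheme; the frequency-based schedule and gamma are assumptions, not the paper's formulation.

```python
# Hedged sketch of class-conditional temperature scaling: divide each
# class's logit by a temperature that grows with that class's training
# frequency, damping majority-class confidence. The frequency-based
# schedule and `gamma` are assumptions, not DBN-Mix's exact formulation.
import torch

def class_conditional_scale(logits, class_counts, gamma=0.3):
    # logits: (B, C); class_counts: (C,) training-set class frequencies.
    freq = class_counts.float() / class_counts.sum()
    temps = (freq / freq.min()) ** gamma  # head classes get higher temperatures
    return logits / temps                 # broadcasts over the batch

logits = torch.randn(4, 3)
scaled = class_conditional_scale(logits, torch.tensor([900, 90, 10]))
```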
- Long-tailed Recognition by Learning from Latent Categories [70.6272114218549]
We introduce a Latent Categories based long-tail Recognition (LCReg) method.
Specifically, we learn a set of class-agnostic latent features shared among the head and tail classes.
Then, we implicitly enrich the training sample diversity by applying semantic data augmentation to the latent features.
arXiv Detail & Related papers (2022-06-02T12:19:51Z)
- Long-tail Recognition via Compositional Knowledge Transfer [60.03764547406601]
We introduce a novel strategy for long-tail recognition that addresses the tail classes' few-shot problem.
Our objective is to transfer knowledge acquired from information-rich common classes to semantically similar, and yet data-hungry, rare classes.
Experiments show that our approach can achieve significant performance boosts on rare classes while maintaining robust common class performance.
arXiv Detail & Related papers (2021-12-13T15:48:59Z)
- Improving Tail-Class Representation with Centroid Contrastive Learning [145.73991900239017]
We propose interpolative centroid contrastive learning (ICCL) to improve long-tailed representation learning.
ICCL interpolates two images drawn from a class-agnostic sampler and a class-aware sampler, and trains the model such that the representation of the interpolated image can be used to retrieve the centroids of both source classes (see the sketch after this entry).
Our result shows a significant accuracy gain of 2.8% on the iNaturalist 2018 dataset with a real-world long-tailed distribution.
arXiv Detail & Related papers (2021-10-19T15:24:48Z)
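A hedged sketch of the ICCL idea summarized above: mix an image from a class-agnostic (instance) sampler with one from a class-aware (balanced) sampler, then train the mixed representation to retrieve the centroids of both source classes, weighted by the mixing coefficient. The temperature tau and the Beta(1, 1) mixing distribution are assumptions, not the paper's exact settings.

```python
# Hedged sketch of the ICCL objective: the mixed image's embedding must
# retrieve the class centroids of BOTH source images, with targets
# weighted by the mixup coefficient. `tau` and Beta(1, 1) are assumptions.
import torch
import torch.nn.functional as F

def iccl_loss(encoder, x_a, y_a, x_b, y_b, centroids, tau=0.1):
    lam = torch.distributions.Beta(1.0, 1.0).sample().item()
    x_mix = lam * x_a + (1.0 - lam) * x_b        # pixel-space interpolation
    z = F.normalize(encoder(x_mix), dim=1)       # (B, D) embeddings
    c = F.normalize(centroids, dim=1)            # (C, D) class centroids
    logits = z @ c.t() / tau                     # centroid retrieval scores
    return lam * F.cross_entropy(logits, y_a) + (1.0 - lam) * F.cross_entropy(logits, y_b)

# Usage with a toy linear encoder and random "images".
encoder = torch.nn.Linear(16, 8)
x_a, x_b = torch.randn(4, 16), torch.randn(4, 16)
y_a, y_b = torch.randint(0, 5, (4,)), torch.randint(0, 5, (4,))
loss = iccl_loss(encoder, x_a, y_a, x_b, y_b, torch.randn(5, 8))
```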
- Balanced Knowledge Distillation for Long-tailed Learning [9.732397447057318]
Deep models trained on long-tailed datasets exhibit unsatisfactory performance on tail classes.
Existing methods usually modify the classification loss to increase the learning focus on tail classes.
We propose Balanced Knowledge Distillation to resolve the conflict between the two goals and achieve both simultaneously (a hedged sketch follows this entry).
arXiv Detail & Related papers (2021-04-21T13:07:35Z)
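In the spirit of the Balanced Knowledge Distillation summary above, a plausible loss pairs standard cross-entropy (which favors head classes) with a distillation term re-weighted toward tail classes. The inverse-frequency weighting and temperature T below are assumptions rather than the paper's exact loss.

```python
# Hedged sketch of a balanced distillation loss: cross-entropy plus a
# KL-based distillation term whose per-sample weight grows for rarer
# classes. The inverse-frequency weighting and temperature T are
# assumptions, not the paper's exact formulation.
import torch
import torch.nn.functional as F

def balanced_kd_loss(student_logits, teacher_logits, targets, class_counts, T=2.0):
    ce = F.cross_entropy(student_logits, targets)
    # Per-sample weight: samples from rarer classes count more in distillation.
    w = (1.0 / class_counts.float())[targets]
    w = w / w.mean()
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="none").sum(dim=1) * (T * T)
    return ce + (w * kd).mean()

# Usage with toy logits for 3 classes whose training counts are 900/90/10.
s, t = torch.randn(4, 3), torch.randn(4, 3)
y = torch.tensor([0, 1, 2, 0])
loss = balanced_kd_loss(s, t, y, torch.tensor([900, 90, 10]))
```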