Long-tailed Recognition by Learning from Latent Categories
- URL: http://arxiv.org/abs/2206.01010v1
- Date: Thu, 2 Jun 2022 12:19:51 GMT
- Title: Long-tailed Recognition by Learning from Latent Categories
- Authors: Weide Liu, Zhonghua Wu, Yiming Wang, Henghui Ding, Fayao Liu, Jie Lin, and Guosheng Lin
- Abstract summary: We introduce a Latent Categories based long-tail Recognition (LCReg) method.
Specifically, we learn a set of class-agnostic latent features shared among the head and tail classes.
Then, we implicitly enrich training-sample diversity by applying semantic data augmentation to the latent features.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this work, we address the challenging task of long-tailed image
recognition. Previous long-tailed recognition methods commonly focus on data
augmentation or re-balancing strategies that give the tail classes more
attention during model training. However, because training images for the tail
classes are limited, the diversity of tail-class images remains restricted,
which results in poor feature representations. In this work, we hypothesize
that latent features common to the head and tail classes can be used to learn
better feature representations. Motivated by this, we introduce a Latent
Categories based long-tail Recognition (LCReg) method. Specifically, we propose
to learn a set of class-agnostic latent features shared among the head and tail
classes. Then, we implicitly enrich training-sample diversity by applying
semantic data augmentation to the latent features. Extensive
experiments on five long-tailed image recognition datasets demonstrate that our
proposed LCReg is able to significantly outperform previous methods and achieve
state-of-the-art results.
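The augmentation idea in the abstract, perturbing shared latent features along semantically meaningful directions to diversify tail classes, can be sketched as follows. This is an illustrative, explicit version under assumed inputs (`semantic_augment`, the per-class covariance dictionary, and the `strength` parameter are not from the paper, and LCReg folds the augmentation implicitly into the training loss rather than sampling features explicitly):

```python
import numpy as np

def semantic_augment(features, labels, class_cov, strength=0.5, rng=None):
    """Perturb each latent feature along its class's covariance directions.

    features:  (N, D) array of latent features
    labels:    (N,) integer class ids
    class_cov: dict mapping class id -> (D, D) covariance estimate
    strength:  scales how far features are perturbed
    """
    rng = np.random.default_rng(rng)
    out = features.copy()
    dim = features.shape[1]
    for i, y in enumerate(labels):
        cov = strength * class_cov[int(y)]
        # sample a semantic direction from the class-conditional distribution
        out[i] += rng.multivariate_normal(np.zeros(dim), cov)
    return out
```

Because the covariance is estimated per class but the latent features are shared, head-class statistics can inform how tail-class features are perturbed, which is the intuition behind enriching tail diversity without extra images.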
Related papers
- Granularity Matters in Long-Tail Learning [62.30734737735273]
We offer a novel perspective on long-tail learning, inspired by an observation: datasets with finer granularity tend to be less affected by data imbalance.
We introduce open-set auxiliary classes that are visually similar to existing ones, aiming to enhance representation learning for both head and tail classes.
To prevent the overwhelming presence of auxiliary classes from disrupting training, we introduce a neighbor-silencing loss.
arXiv Detail & Related papers (2024-10-21T13:06:21Z)
- LCReg: Long-Tailed Image Classification with Latent Categories based Recognition [81.5551335554507]
We propose the Latent Categories based long-tail Recognition (LCReg) method.
Our hypothesis is that common latent features shared by head and tail classes can be used to improve feature representation.
Specifically, we learn a set of class-agnostic latent features shared by both head and tail classes, and then use semantic data augmentation on the latent features to implicitly increase the diversity of the training samples.
arXiv Detail & Related papers (2023-09-13T02:03:17Z)
- Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts [10.99232053983369]
We propose a novel long-tailed graph-level classification framework via Collaborative Multi-expert Learning (CoMe).
To balance the contributions of head and tail classes, we first develop balanced contrastive learning from the view of representation learning.
We perform gated fusion and disentangled knowledge distillation among the multiple experts to promote collaboration within the multi-expert framework.
arXiv Detail & Related papers (2023-08-31T10:12:32Z)
- SuperDisco: Super-Class Discovery Improves Visual Recognition for the Long-Tail [69.50380510879697]
We propose SuperDisco, an algorithm that discovers super-class representations for long-tailed recognition.
We learn to construct the super-class graph to guide the representation learning to deal with long-tailed distributions.
arXiv Detail & Related papers (2023-03-31T19:51:12Z)
- Improving Tail-Class Representation with Centroid Contrastive Learning [145.73991900239017]
We propose interpolative centroid contrastive learning (ICCL) to improve long-tailed representation learning.
ICCL interpolates two images, one from a class-agnostic sampler and one from a class-aware sampler, and trains the model such that the representation of the interpolated image can be used to retrieve the centroids of both source classes.
Our result shows a significant accuracy gain of 2.8% on the iNaturalist 2018 dataset with a real-world long-tailed distribution.
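The sampling-and-interpolation step described above can be sketched as below. The function names, the class-aware sampler, and the fixed `lam` are illustrative assumptions; ICCL's contrastive objective of retrieving both class centroids from the interpolated representation is not shown:

```python
import numpy as np

def class_aware_sample(images_by_class, rng):
    # pick a class uniformly (head and tail equally likely), then an instance;
    # a class-agnostic sampler would instead pick uniformly over all images
    cls = rng.choice(list(images_by_class))
    idx = rng.integers(len(images_by_class[cls]))
    return images_by_class[cls][idx], cls

def interpolate(img_a, img_b, lam):
    # mixup-style convex combination of the two sampled images
    return lam * img_a + (1.0 - lam) * img_b
```

Pairing a class-agnostic sample (which reflects the natural, head-heavy distribution) with a class-aware sample (which favors tail classes equally) is what lets the interpolated image transfer head-class variation to tail-class representations.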
arXiv Detail & Related papers (2021-10-19T15:24:48Z)
- Class-Balanced Distillation for Long-Tailed Visual Recognition [100.10293372607222]
Real-world imagery is often characterized by a significant imbalance of the number of images per class, leading to long-tailed distributions.
In this work, we introduce a new framework, based on the key observation that a feature representation learned with instance sampling is far from optimal in a long-tailed setting.
Our main contribution is a new training method that leverages knowledge distillation to enhance feature representations.
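The observation about instance sampling can be made concrete with a small comparison: under instance sampling every image is equally likely, so head classes dominate each batch, whereas inverse-frequency weights give every class equal expected mass. The helper below illustrates that contrast only; it is not the paper's distillation method, and the name `sampling_weights` is an assumption:

```python
import numpy as np

def sampling_weights(labels, balanced=True):
    """Per-sample draw probabilities for a weighted sampler.

    balanced=False: instance sampling (uniform over images, head-dominated).
    balanced=True:  inverse class-frequency weights, so each class
                    receives equal probability mass in expectation.
    """
    labels = np.asarray(labels)
    if balanced:
        counts = np.bincount(labels)      # images per class
        w = 1.0 / counts[labels]          # inverse frequency per sample
    else:
        w = np.ones(len(labels))
    return w / w.sum()                    # normalize to a distribution
```

For a toy long-tailed label set `[0, 0, 0, 1]`, instance sampling gives class 0 a 75% share of draws, while the balanced weights give each class 50%.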
arXiv Detail & Related papers (2021-04-12T08:21:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.