Class-Difficulty Based Methods for Long-Tailed Visual Recognition
- URL: http://arxiv.org/abs/2207.14499v1
- Date: Fri, 29 Jul 2022 06:33:22 GMT
- Title: Class-Difficulty Based Methods for Long-Tailed Visual Recognition
- Authors: Saptarshi Sinha and Hiroki Ohashi and Katsuyuki Nakamura
- Abstract summary: Long-tailed datasets are frequently encountered in real-world use cases where a few classes or categories have a higher number of data samples than the other classes.
We propose a novel approach to dynamically measure the instantaneous difficulty of each class during the training phase of the model.
We also use the difficulty measures of each class to design a novel weighted loss technique called `class-wise difficulty based weighted (CDB-W) loss' and a novel data sampling technique called `class-wise difficulty based sampling (CDB-S)'.
- Score: 6.875312133832079
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Long-tailed datasets are very frequently encountered in real-world use cases
where a few classes or categories (known as majority or head classes) have a
higher number of data samples than the other classes (known as minority or tail
classes). Training deep neural networks on such datasets gives results biased
towards the head classes. So far, researchers have come up with multiple
weighted loss and data re-sampling techniques in efforts to reduce the bias.
However, most such techniques assume that the tail classes are always the
most difficult classes to learn and therefore need more weightage or attention.
Here, we argue that the assumption might not always hold true. Therefore, we
propose a novel approach to dynamically measure the instantaneous difficulty of
each class during the training phase of the model. Further, we use the
difficulty measures of each class to design a novel weighted loss technique
called `class-wise difficulty based weighted (CDB-W) loss' and a novel data
sampling technique called `class-wise difficulty based sampling (CDB-S)'. To
verify the wide-scale usability of our CDB methods, we conducted extensive
experiments on multiple tasks such as image classification, object detection,
instance segmentation and video-action classification. Results verified that
CDB-W loss and CDB-S could achieve state-of-the-art results on many
class-imbalanced datasets such as ImageNet-LT, LVIS and EGTEA, that resemble
real-world use cases.
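One plausible instantiation of the class-wise difficulty idea above is to take a class's difficulty as 1 minus its current accuracy and raise it to a power to obtain loss weights. This is only a sketch: the exact difficulty measure, the exponent `tau`, and the normalization below are assumptions, not necessarily the paper's formulation.

```python
import numpy as np

def class_difficulty_weights(preds, labels, num_classes, tau=1.0):
    """Sketch of class-wise difficulty based weighting (CDB-W idea):
    a class's difficulty is taken as 1 - its current accuracy, and its
    loss weight grows with that difficulty."""
    acc = np.zeros(num_classes)
    for c in range(num_classes):
        mask = labels == c
        acc[c] = (preds[mask] == c).mean() if mask.any() else 0.0
    difficulty = 1.0 - acc        # harder (less accurate) classes -> larger values
    weights = difficulty ** tau   # tau controls how aggressively difficulty is weighted
    # normalize so the average weight is 1 (keeps the overall loss scale stable)
    return weights * num_classes / max(weights.sum(), 1e-12)
```

Because the accuracies are recomputed as training progresses, the weights track the instantaneous difficulty of each class rather than assuming tail classes are always hardest.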
Related papers
- Difficulty-Net: Learning to Predict Difficulty for Long-Tailed Recognition [5.977483447975081]
We propose Difficulty-Net, which learns to predict the difficulty of classes using the model's performance in a meta-learning framework.
We introduce two key concepts, namely the relative difficulty and the driver loss.
Experiments on popular long-tailed datasets demonstrated the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-09-07T07:04:08Z)
- CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning [55.733193075728096]
Modern deep neural networks can easily overfit to biased training data containing corrupted labels or class imbalance.
Sample re-weighting methods are popularly used to alleviate this data bias issue.
We propose a meta-model capable of adaptively learning an explicit weighting scheme directly from data.
arXiv Detail & Related papers (2022-02-11T13:49:51Z)
- Relieving Long-tailed Instance Segmentation via Pairwise Class Balance [85.53585498649252]
Long-tailed instance segmentation is a challenging task due to the extreme imbalance of training samples among classes.
This causes severe bias of the head classes (with majority samples) against the tail ones.
We propose a novel Pairwise Class Balance (PCB) method, built upon a confusion matrix which is updated during training to accumulate the ongoing prediction preferences.
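The running confusion matrix this entry describes can be sketched as follows. The momentum-style update and the way prediction preferences are read off are assumptions for illustration, not the paper's exact PCB formulation.

```python
import numpy as np

class RunningConfusion:
    """Sketch of accumulating the model's ongoing prediction preferences
    during training via an exponentially averaged confusion matrix."""
    def __init__(self, num_classes, momentum=0.9):
        self.mat = np.zeros((num_classes, num_classes))
        self.momentum = momentum

    def update(self, labels, preds):
        # tally this batch: rows are true classes, columns predicted classes
        batch = np.zeros_like(self.mat)
        for y, p in zip(labels, preds):
            batch[y, p] += 1
        self.mat = self.momentum * self.mat + (1 - self.momentum) * batch

    def preference(self):
        # row i: the model's current prediction distribution for true class i;
        # off-diagonal mass toward head classes exposes the bias to correct
        row_sum = self.mat.sum(axis=1, keepdims=True)
        return self.mat / np.maximum(row_sum, 1e-12)
```

A balancing term can then, for example, penalize classes toward which tail samples are systematically misclassified.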
arXiv Detail & Related papers (2022-01-08T07:48:36Z)
- CvS: Classification via Segmentation For Small Datasets [52.821178654631254]
This paper presents CvS, a cost-effective classifier for small datasets that derives the classification labels from predicting the segmentation maps.
We evaluate the effectiveness of our framework on diverse problems, showing that CvS achieves much higher classification accuracy than previous methods when given only a handful of examples.
arXiv Detail & Related papers (2021-10-29T18:41:15Z)
- Bridging Non Co-occurrence with Unlabeled In-the-wild Data for Incremental Object Detection [56.22467011292147]
Several incremental learning methods have been proposed to mitigate catastrophic forgetting for object detection.
Despite the effectiveness, these methods require co-occurrence of the unlabeled base classes in the training data of the novel classes.
We propose the use of unlabeled in-the-wild data to bridge the non co-occurrence caused by the missing base classes during the training of additional novel classes.
arXiv Detail & Related papers (2021-10-28T10:57:25Z)
- Distributional Robustness Loss for Long-tail Learning [20.800627115140465]
Real-world data is often unbalanced and long-tailed, but deep models struggle to recognize rare classes in the presence of frequent classes.
We show that the feature extractor part of deep networks suffers greatly from this bias.
We propose a new loss based on robustness theory, which encourages the model to learn high-quality representations for both head and tail classes.
arXiv Detail & Related papers (2021-04-07T11:34:04Z)
- Theoretical Insights Into Multiclass Classification: A High-dimensional Asymptotic View [82.80085730891126]
We provide the first precise asymptotic analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z)
- Class-Wise Difficulty-Balanced Loss for Solving Class-Imbalance [6.875312133832079]
We propose a novel loss function named Class-wise Difficulty-Balanced loss.
It dynamically distributes weights to each sample according to the difficulty of the class that the sample belongs to.
The results show that CDB loss consistently outperforms the recently proposed loss functions on class-imbalanced datasets.
arXiv Detail & Related papers (2020-10-05T07:19:19Z)
- The Devil is the Classifier: Investigating Long Tail Relation Classification with Decoupling Analysis [36.298869931803836]
Long-tailed relation classification is a challenging problem as the head classes may dominate the training phase.
We propose a robust classifier with attentive relation routing, which assigns soft weights by automatically aggregating the relations.
arXiv Detail & Related papers (2020-09-15T12:47:00Z)
- The Devil is in Classification: A Simple Framework for Long-tail Object Detection and Instance Segmentation [93.17367076148348]
We investigate performance drop of the state-of-the-art two-stage instance segmentation model Mask R-CNN on the recent long-tail LVIS dataset.
We unveil that a major cause is the inaccurate classification of object proposals.
We propose a simple calibration framework to more effectively alleviate classification head bias with a bi-level class balanced sampling approach.
arXiv Detail & Related papers (2020-07-23T12:49:07Z)
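The class-balanced sampling idea recurring in these entries can be sketched as a two-step draw: pick a class uniformly at random, then pick an instance of that class, so tail classes are visited as often as head classes. The function name and data layout here are illustrative.

```python
import random

def class_balanced_sample(index_by_class, rng):
    """Sketch of class-balanced sampling: draw a class uniformly, then an
    instance of that class, equalizing how often each class is seen."""
    cls = rng.choice(sorted(index_by_class))   # every class is equally likely
    return rng.choice(index_by_class[cls])     # then pick one of its instances
```

Under instance-balanced sampling a 100:1 head/tail split would surface the tail instance about 1% of the time; here each class contributes about half the draws regardless of its size.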
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.