Influence-Balanced Loss for Imbalanced Visual Classification
- URL: http://arxiv.org/abs/2110.02444v1
- Date: Wed, 6 Oct 2021 01:12:40 GMT
- Title: Influence-Balanced Loss for Imbalanced Visual Classification
- Authors: Seulki Park, Jongin Lim, Younghan Jeon, Jin Young Choi
- Abstract summary: We derive a new loss used in the balancing training phase that alleviates the influence of samples that cause an overfitted decision boundary.
In experiments on multiple benchmark data sets, we demonstrate the validity of our method and reveal that the proposed loss outperforms the state-of-the-art cost-sensitive loss methods.
- Score: 9.958715010698157
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a balancing training method to address problems in
imbalanced data learning. To this end, we derive a new loss used in the
balancing training phase that alleviates the influence of samples that cause an
overfitted decision boundary. The proposed loss efficiently improves the
performance of any type of imbalanced learning method. In experiments on
multiple benchmark data sets, we demonstrate the validity of our method and
reveal that the proposed loss outperforms the state-of-the-art cost-sensitive
loss methods. Furthermore, since our loss is not restricted to a specific task,
model, or training method, it can be easily used in combination with other
recent re-sampling, meta-learning, and cost-sensitive learning methods for
class-imbalance problems.
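The abstract leaves the loss itself unspecified here; as a rough, non-authoritative sketch, the influence-balanced idea can be written as re-weighting each sample's cross-entropy by the inverse of a proxy for its influence on the decision boundary, applied during the balancing phase that follows an initial normal-training phase. The influence proxy below (gradient magnitude at the final linear layer) and the inverse-frequency class weights are assumptions for illustration, not necessarily the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def influence_balanced_loss(logits, features, targets, class_counts, eps=1e-3):
    """Hedged sketch: down-weight samples with large influence proxies.

    Assumption: a sample's influence on the decision boundary is
    approximated by the gradient magnitude at the final linear layer,
    ||softmax(logits) - one_hot(targets)||_1 * ||features||_1.
    """
    num_classes = logits.size(1)
    one_hot = F.one_hot(targets, num_classes).float()
    grad_norm = (F.softmax(logits, dim=1) - one_hot).abs().sum(dim=1)
    influence = grad_norm * features.abs().sum(dim=1)    # per-sample proxy
    class_w = class_counts.sum() / class_counts.float()  # assumed inverse-frequency form
    ce = F.cross_entropy(logits, targets, reduction="none")
    return (class_w[targets] * ce / (influence + eps)).mean()
```

Because the weighting only touches the loss, such a term can be combined with re-sampling or meta-learning pipelines, which is the portability the abstract claims.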
Related papers
- Gradient Reweighting: Towards Imbalanced Class-Incremental Learning [8.438092346233054]
Class-Incremental Learning (CIL) trains a model to continually recognize new classes from non-stationary data.
A major challenge of CIL arises when it is applied to real-world data characterized by non-uniform distributions.
We show that this dual imbalance issue causes skewed gradient updates with biased weights in FC layers, thus inducing over/under-fitting and catastrophic forgetting in CIL.
arXiv Detail & Related papers (2024-02-28T18:08:03Z)
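As a loose illustration of the gradient-reweighting theme above (not the paper's actual mechanism, which targets the dual imbalance of class-incremental learning), skewed FC-layer gradients can be counteracted by scaling each class's loss contribution with its inverse frequency:

```python
import torch
import torch.nn.functional as F

def reweighted_ce(logits, targets, class_counts):
    # Hedged sketch: inverse-frequency class weights rebalance the
    # gradient each class sends into the FC layer (illustration only).
    w = class_counts.sum() / (len(class_counts) * class_counts.float())
    return F.cross_entropy(logits, targets, weight=w)
```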
- Simplifying Neural Network Training Under Class Imbalance [77.39968702907817]
Real-world datasets are often highly class-imbalanced, which can adversely impact the performance of deep learning models.
The majority of research on training neural networks under class imbalance has focused on specialized loss functions, sampling techniques, or two-stage training procedures.
We demonstrate that simply tuning existing components of standard deep learning pipelines, such as the batch size, data augmentation, and label smoothing, can achieve state-of-the-art performance without any such specialized class imbalance methods.
arXiv Detail & Related papers (2023-12-05T05:52:44Z)
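In that spirit, "tuning existing components" can be as plain as the snippet below; the concrete values are placeholders, not the paper's recommended settings:

```python
import torch.nn as nn
import torchvision.transforms as T

# Only standard pipeline knobs, no specialized imbalance machinery.
augment = T.Compose([
    T.RandomResizedCrop(32),        # data augmentation strength (placeholder)
    T.RandomHorizontalFlip(),
    T.ToTensor(),
])
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)  # tuned label smoothing (placeholder)
batch_size = 128                                      # tuned batch size (placeholder)
```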
- A Unified Generalization Analysis of Re-Weighting and Logit-Adjustment for Imbalanced Learning [129.63326990812234]
We propose a technique named data-dependent contraction to capture how modified losses handle different classes.
On top of this technique, a fine-grained generalization bound is established for imbalanced learning, which helps reveal the mystery of re-weighting and logit-adjustment.
arXiv Detail & Related papers (2023-10-07T09:15:08Z)
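For readers unfamiliar with the second technique analyzed there, logit adjustment is commonly written as adding scaled log class priors to the logits; the form below follows the usual convention and is assumed here, not taken from this paper:

```python
import torch

def logit_adjust(logits, class_priors, tau=1.0):
    # Common logit-adjustment form: frequent classes must win by a
    # larger margin, which favors the tail at decision time.
    return logits + tau * torch.log(class_priors)
```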
- Scaling of Class-wise Training Losses for Post-hoc Calibration [6.0632746602205865]
We propose a new calibration method to synchronize the class-wise training losses.
We design a new training loss to alleviate the variance of class-wise training losses by using multiple class-wise scaling factors.
We validate the proposed framework by employing it with various post-hoc calibration methods.
arXiv Detail & Related papers (2023-06-19T14:59:37Z)
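A minimal sketch of the class-wise scaling idea, where each class's losses are pulled toward the batch-mean loss to reduce their variance (the normalization rule here is an assumption; the paper's scaling factors may be derived differently):

```python
import torch
import torch.nn.functional as F

def class_scaled_loss(logits, targets, num_classes, eps=1e-8):
    # Hedged sketch: scale each class's losses so per-class means align.
    ce = F.cross_entropy(logits, targets, reduction="none")
    weights = torch.ones_like(ce)
    overall = ce.mean().detach()
    for c in range(num_classes):
        mask = targets == c
        if mask.any():
            weights[mask] = overall / (ce[mask].mean().detach() + eps)
    return (weights * ce).mean()
```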
- Effective Decision Boundary Learning for Class Incremental Learning [17.716035569936384]
Rehearsal approaches in class incremental learning (CIL) suffer from decision boundary overfitting to new classes.
We present a simple but effective approach to tackle these two factors.
Experiments show that the proposed approach achieves state-of-the-art performance on several CIL benchmarks.
arXiv Detail & Related papers (2023-01-12T18:04:51Z)
- Learning to Re-weight Examples with Optimal Transport for Imbalanced Classification [74.62203971625173]
Imbalanced data pose challenges for deep learning-based classification models.
One of the most widely used approaches for tackling imbalanced data is re-weighting.
We propose a novel re-weighting method based on optimal transport (OT) from a distributional point of view.
arXiv Detail & Related papers (2022-08-05T01:23:54Z)
- Continual Learning For On-Device Environmental Sound Classification [63.81276321857279]
We propose a simple and efficient continual learning method for on-device environmental sound classification.
Our method selects historical data for training by measuring per-sample classification uncertainty.
arXiv Detail & Related papers (2022-07-15T12:13:04Z)
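A plausible reading of that selection step, using predictive entropy as the uncertainty score (the paper's exact measure is not stated in the summary and may differ):

```python
import torch
import torch.nn.functional as F

def select_for_replay(logits, k):
    # Hedged sketch: keep the k most uncertain historical samples,
    # scored by predictive entropy.
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * torch.log(probs + 1e-12)).sum(dim=1)
    return torch.topk(entropy, k).indices
```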
- CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning [55.733193075728096]
Modern deep neural networks can easily overfit to biased training data containing corrupted labels or class imbalance.
Sample re-weighting methods are widely used to alleviate this data bias issue.
We propose a meta-model capable of adaptively learning an explicit weighting scheme directly from data.
arXiv Detail & Related papers (2022-02-11T13:49:51Z)
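The meta-model amounts to a small network that maps per-sample statistics to weights and is trained against a clean meta set; a bare-bones sketch follows (sizes and inputs are assumptions, and CMW-Net's class-aware conditioning is omitted):

```python
import torch
import torch.nn as nn

class WeightNet(nn.Module):
    # Hedged sketch: map a per-sample loss value to a weight in (0, 1).
    def __init__(self, hidden=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, per_sample_loss):                           # (B,)
        return self.net(per_sample_loss.unsqueeze(1)).squeeze(1)  # (B,)
```

In the meta-learning setting, such a weighting net is updated so that the weighted training loss improves performance on the held-out meta set (a bi-level loop not shown here).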
- Which Strategies Matter for Noisy Label Classification? Insight into Loss and Uncertainty [7.20844895799647]
Label noise is a critical factor that degrades the generalization performance of deep neural networks.
We present analytical results on how loss and uncertainty values of samples change throughout the training process.
We design a new robust training method that emphasizes clean and informative samples, while minimizing the influence of noise.
arXiv Detail & Related papers (2020-08-14T07:34:32Z)
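One standard way to emphasize clean samples is the small-loss criterion, used here as a stand-in (the paper's method additionally draws on uncertainty, which this sketch omits):

```python
import torch
import torch.nn.functional as F

def small_loss_ce(logits, targets, keep_ratio=0.7):
    # Hedged stand-in: treat low-loss samples as likely clean and
    # train only on them.
    ce = F.cross_entropy(logits, targets, reduction="none")
    k = max(1, int(keep_ratio * ce.numel()))
    clean = torch.topk(ce, k, largest=False).indices
    return ce[clean].mean()
```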
- Step-Ahead Error Feedback for Distributed Training with Compressed Gradient [99.42912552638168]
We show that a new "gradient mismatch" problem arises from local error feedback in centralized distributed training.
We propose two novel techniques, 1) step ahead and 2) error averaging, with rigorous theoretical analysis.
arXiv Detail & Related papers (2020-08-13T11:21:07Z)
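For context, the plain local error feedback that the paper builds on looks like the sketch below; the proposed "step ahead" and "error averaging" corrections sit on top of this baseline and are not reproduced here:

```python
import torch

def top_k_compress(grad, k):
    # Toy compressor (assumed choice): keep the k largest-magnitude entries.
    flat = grad.flatten()
    out = torch.zeros_like(flat)
    idx = torch.topk(flat.abs(), k).indices
    out[idx] = flat[idx]
    return out.view_as(grad)

def error_feedback_step(grad, residual, k):
    # Plain error feedback: compress gradient plus carried-over residual,
    # transmit the compressed part, keep the remainder as the new residual.
    corrected = grad + residual
    sent = top_k_compress(corrected, k)
    return sent, corrected - sent
```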
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.