Meta Transition Adaptation for Robust Deep Learning with Noisy Labels
- URL: http://arxiv.org/abs/2006.05697v2
- Date: Fri, 12 Jun 2020 01:18:07 GMT
- Title: Meta Transition Adaptation for Robust Deep Learning with Noisy Labels
- Authors: Jun Shu, Qian Zhao, Zongben Xu, Deyu Meng
- Abstract summary: This study proposes a new meta-transition-learning strategy for the task.
Specifically, guided by a small set of meta data with clean labels, the noise transition matrix and the classifier parameters are mutually ameliorated.
Our method extracts the transition matrix more accurately, which naturally leads to more robust performance than prior art.
- Score: 61.8970957519509
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To discover the intrinsic inter-class transition probabilities underlying data,
learning with a noise transition matrix has become an important approach for robust deep
learning on corrupted labels. Prior methods attempt to obtain such transition
knowledge either by pre-assuming strongly confident anchor points that belong to a
specific class with probability one, which is generally infeasible in practice, or by
jointly estimating the transition matrix and learning the classifier directly from the
noisy samples, which often leads to inaccurate estimation misguided by wrong
annotation information, especially under heavy noise. To alleviate these
issues, this study proposes a new meta-transition-learning strategy for the
task. Specifically, under the sound guidance of a small set of meta data with
clean labels, the noise transition matrix and the classifier parameters can be
mutually ameliorated to avoid being trapped by noisy training samples, without
the need for any anchor-point assumptions. Besides, we prove that our method
comes with a statistical consistency guarantee for correctly estimating the desired
transition matrix. Extensive synthetic and real-world experiments validate that our
method extracts the transition matrix more accurately, which naturally leads to
more robust performance than prior art. Its essential relationship with
label distribution learning is also discussed, which explains its strong
performance even in noise-free scenarios.
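To make the role of the transition matrix concrete, here is a minimal illustrative sketch (not the paper's implementation): a noise transition matrix T is row-stochastic, with T[i, j] = P(noisy label = j | true label = i), and the noisy class posterior is the clean posterior pushed through T. The `noisy_posterior` helper and the symmetric-noise example below are assumptions made for illustration only.

```python
import numpy as np

def noisy_posterior(clean_posterior, T):
    """Map clean class posteriors to noisy ones via transition matrix T:
    P(y_noisy = j | x) = sum_i P(y_true = i | x) * T[i, j]."""
    return clean_posterior @ T

# Hypothetical example: 3 classes with 20% symmetric label noise,
# i.e. each true label flips to either other class with probability 0.1.
num_classes = 3
noise_rate = 0.2
T = np.full((num_classes, num_classes), noise_rate / (num_classes - 1))
np.fill_diagonal(T, 1.0 - noise_rate)

clean = np.array([0.7, 0.2, 0.1])   # assumed clean posterior for one sample
noisy = noisy_posterior(clean, T)

assert np.allclose(T.sum(axis=1), 1.0)  # rows of T sum to 1 (row-stochastic)
assert np.isclose(noisy.sum(), 1.0)     # noisy posterior is still a distribution
```

Methods in this line of work differ mainly in how they estimate T: anchor-point approaches assume samples with a clean posterior of exactly one for some class, while the meta-learning strategy above instead uses a small clean meta set to steer the estimate of T jointly with the classifier.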
Related papers
- Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Robust Meta-learning with Sampling Noise and Label Noise via Eigen-Reptile [78.1212767880785]
The meta-learner is prone to overfitting since only a few samples are available.
When handling data with noisy labels, the meta-learner can be extremely sensitive to label noise.
We present Eigen-Reptile (ER), which updates the meta-parameters along the main direction of historical task-specific parameters.
arXiv Detail & Related papers (2022-06-04T08:48:02Z)
- Beyond Images: Label Noise Transition Matrix Estimation for Tasks with Lower-Quality Features [13.659465403114766]
We propose a practical information-theoretic approach to down-weight the less informative parts of lower-quality features.
We prove that the celebrated $f$-mutual information measure can often preserve the order when calculated using noisy labels.
arXiv Detail & Related papers (2022-02-02T20:36:09Z)
- Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization [88.91872713134342]
We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
arXiv Detail & Related papers (2021-02-04T05:09:18Z)
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered.
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under mixed closed-set and open-set label noise.
Our method better models the mixed label noise, yielding more robust performance than prior state-of-the-art label-noise learning methods.
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.