Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization
- URL: http://arxiv.org/abs/2102.02414v1
- Date: Thu, 4 Feb 2021 05:09:18 GMT
- Title: Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization
- Authors: Yivan Zhang, Gang Niu, Masashi Sugiyama
- Abstract summary: We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
- Score: 88.91872713134342
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many weakly supervised classification methods employ a noise transition
matrix to capture the class-conditional label corruption. To estimate the
transition matrix from noisy data, existing methods often need to estimate the
noisy class-posterior, which could be unreliable due to the overconfidence of
neural networks. In this work, we propose a theoretically grounded method that
can estimate the noise transition matrix and learn a classifier simultaneously,
without relying on the error-prone noisy class-posterior estimation.
Concretely, inspired by the characteristics of the stochastic label corruption
process, we propose total variation regularization, which encourages the
predicted probabilities to be more distinguishable from each other. Under mild
assumptions, the proposed method yields a consistent estimator of the
transition matrix. We show the effectiveness of the proposed method through
experiments on benchmark and real-world datasets.
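As a concrete illustration of the abstract above, here is a minimal, self-contained sketch (not the authors' released code) of training a classifier jointly with a row-stochastic transition matrix on noisy labels, using a pairwise total-variation penalty that encourages the predicted clean posteriors within a mini-batch to be distinguishable from each other. The toy backbone, the initialization, the regularization weight `lam`, and the exact form of the penalty are illustrative assumptions and may differ from the paper's formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyLabelModel(nn.Module):
    """Classifier plus a learnable, row-stochastic noise transition matrix T."""
    def __init__(self, num_features, num_classes):
        super().__init__()
        self.backbone = nn.Linear(num_features, num_classes)   # toy backbone
        # Unconstrained parameters; a row-wise softmax keeps T row-stochastic.
        # Initialized near the identity, i.e. little assumed noise at the start.
        self.T_logits = nn.Parameter(3.0 * torch.eye(num_classes))

    def forward(self, x):
        p_clean = F.softmax(self.backbone(x), dim=1)            # p(y | x)
        T = F.softmax(self.T_logits, dim=1)                     # T[i, j] ~ P(noisy = j | clean = i)
        p_noisy = p_clean @ T                                   # p(noisy label | x)
        return p_clean, p_noisy

def total_variation_penalty(p_clean):
    """Negative mean pairwise total variation distance between predicted
    clean posteriors in the batch; minimizing it pushes predictions apart."""
    diff = p_clean.unsqueeze(0) - p_clean.unsqueeze(1)          # (B, B, C)
    tv = 0.5 * diff.abs().sum(dim=2)                            # pairwise TV distances in [0, 1]
    return -tv.mean()

def training_loss(model, x, noisy_y, lam=0.1):
    p_clean, p_noisy = model(x)
    nll = F.nll_loss(torch.log(p_noisy + 1e-12), noisy_y)       # fit the noisy labels through T
    return nll + lam * total_variation_penalty(p_clean)

# Tiny usage example on random data.
model = NoisyLabelModel(num_features=20, num_classes=5)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, noisy_y = torch.randn(64, 20), torch.randint(0, 5, (64,))
opt.zero_grad()
training_loss(model, x, noisy_y).backward()
opt.step()
```

After training, `F.softmax(model.T_logits, dim=1)` is the estimated transition matrix and `p_clean` is the classifier's prediction for the clean label.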
Related papers
- Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered (a brief note on this condition follows this entry).
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
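To unpack the "sufficiently scattered" condition mentioned in the entry above (an explanatory note, not text from that paper): write the transition matrix and the noisy class-posterior as

$$ T_{ij} = P(\tilde{Y}=j \mid Y=i), \qquad p(\tilde{y}=j \mid x) = \sum_{i} p(y=i \mid x)\, T_{ij}. $$

In the classical anchor-point argument, if for every class $i$ there is an instance $x_i$ with $p(y=i \mid x_i) \approx 1$, then $p(\tilde{y} \mid x_i)$ directly reveals the $i$-th row of $T$. A sufficiently-scattered condition relaxes this: the clean posteriors need not reach the vertices of the probability simplex, only spread out enough that a single row-stochastic $T$ is consistent with the observed noisy posteriors.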
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under the mixed closed-set and open-set label noise.
Our method can better model the mixed label noise, as reflected in its more robust performance compared with prior state-of-the-art label-noise learning methods.
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
- Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning [157.2709657207203]
Existing methods for estimating the transition matrix rely heavily on estimating the noisy class posterior.
We introduce an intermediate class to avoid directly estimating the noisy class posterior.
Via this intermediate class, the original transition matrix can be factorized into the product of two easy-to-estimate transition matrices (written out after this entry).
arXiv Detail & Related papers (2020-06-14T05:48:20Z)
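For the Dual T entry above, the role of the intermediate class $Y'$ can be written out explicitly (a standard marginalization, stated here for clarity rather than quoted from the paper):

$$ T_{ij} = P(\tilde{Y}=j \mid Y=i) = \sum_{l} P(\tilde{Y}=j \mid Y'=l, Y=i)\, P(Y'=l \mid Y=i) \;\approx\; \sum_{l} P(\tilde{Y}=j \mid Y'=l)\, P(Y'=l \mid Y=i), $$

so that $T \approx A\,B$ with $A_{il} = P(Y'=l \mid Y=i)$ and $B_{lj} = P(\tilde{Y}=j \mid Y'=l)$, two factors that are easier to estimate than $T$ itself.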
- Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [61.8970957519509]
This study proposes a new meta transition adaptation strategy for learning with noisy labels.
Specifically, through the sound guidance of a small set of meta data with clean labels, the noise transition matrix and the classifier parameters can be mutually ameliorated.
Our method extracts the transition matrix more accurately, which naturally translates into more robust performance than prior art; a simplified sketch of the meta-update idea follows this entry.
arXiv Detail & Related papers (2020-06-10T07:27:25Z)
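Below is a rough, self-contained sketch of the meta-update idea in the entry above (not the authors' algorithm; the virtual-update scheme, toy model, and hyperparameters are all illustrative assumptions): the classifier is trained on noisy labels through the current $T$, while $T$ is updated so that a virtual classifier step would reduce the loss on a small clean meta set.

```python
import torch
import torch.nn.functional as F

d, c = 20, 5
W = torch.randn(d, c, requires_grad=True)                    # toy linear classifier
T_logits = (3.0 * torch.eye(c)).requires_grad_(True)         # row softmax -> T near identity

def clean_probs(x, W):
    return F.softmax(x @ W, dim=1)                           # p(y | x)

def noisy_nll(x, y_noisy, W, T_logits):
    T = F.softmax(T_logits, dim=1)                           # T[i, j] ~ P(noisy = j | clean = i)
    p_noisy = clean_probs(x, W) @ T
    return F.nll_loss(torch.log(p_noisy + 1e-12), y_noisy)

# Toy data: a noisy training batch and a small clean meta batch.
x_tr, y_tr = torch.randn(64, d), torch.randint(0, c, (64,))
x_meta, y_meta = torch.randn(16, d), torch.randint(0, c, (16,))
lr, meta_lr = 0.1, 0.1

for step in range(100):
    # (1) Virtual classifier step on the noisy loss, keeping the graph alive.
    grad_W = torch.autograd.grad(noisy_nll(x_tr, y_tr, W, T_logits), W, create_graph=True)[0]
    W_virtual = W - lr * grad_W
    # (2) Update T so that the virtual step improves the clean meta loss.
    meta_loss = F.nll_loss(torch.log(clean_probs(x_meta, W_virtual) + 1e-12), y_meta)
    grad_T = torch.autograd.grad(meta_loss, T_logits)[0]
    with torch.no_grad():
        T_logits -= meta_lr * grad_T
    # (3) Real classifier step through the updated T.
    grad_W = torch.autograd.grad(noisy_nll(x_tr, y_tr, W, T_logits), W)[0]
    with torch.no_grad():
        W -= lr * grad_W
```

The alternation is the point: the clean meta set only ever influences the update of $T$ (through the virtual step), while the classifier itself learns from the noisy labels.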