Dual T: Reducing Estimation Error for Transition Matrix in Label-noise
Learning
- URL: http://arxiv.org/abs/2006.07805v3
- Date: Wed, 23 Jun 2021 05:08:54 GMT
- Title: Dual T: Reducing Estimation Error for Transition Matrix in Label-noise
Learning
- Authors: Yu Yao, Tongliang Liu, Bo Han, Mingming Gong, Jiankang Deng, Gang Niu,
Masashi Sugiyama
- Abstract summary: Existing methods for estimating the transition matrix rely heavily on estimating the noisy class posterior.
We introduce an intermediate class to avoid directly estimating the noisy class posterior.
With this intermediate class, the original transition matrix can then be factorized into the product of two easy-to-estimate transition matrices.
- Score: 157.2709657207203
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The transition matrix, denoting the transition relationship from clean labels
to noisy labels, is essential to build statistically consistent classifiers in
label-noise learning. Existing methods for estimating the transition matrix
rely heavily on estimating the noisy class posterior. However, the estimation
error for the noisy class posterior can be large due to the randomness of label
noise, which in turn causes the transition matrix to be poorly estimated.
Therefore, in this paper, we aim to solve this problem by exploiting the
divide-and-conquer paradigm. Specifically, we introduce an intermediate class
to avoid directly estimating the noisy class posterior. With this intermediate
class, the original transition matrix can then be factorized into the product
of two easy-to-estimate transition matrices. We term the proposed method the
dual-T estimator. Both theoretical analyses and empirical results illustrate
the effectiveness of the dual-T estimator for estimating transition matrices,
leading to better classification performances.
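As a concrete reading of the factorization described above, here is a minimal sketch, assuming the intermediate class is defined by the softmax output of a classifier trained on the noisy data. The function name, the anchor selection, and the soft counting are illustrative assumptions, not the paper's exact procedure; the intended benefit, per the abstract, is that neither factor requires directly estimating the noisy class posterior.

```python
import numpy as np

def dual_t_estimate(noisy_posteriors, noisy_labels):
    """Rough dual-T style estimate of the clean-to-noisy transition matrix.

    noisy_posteriors: (n, c) softmax outputs of a classifier trained on the
                      noisy data; these define the intermediate class.
    noisy_labels:     (n,) observed noisy labels in {0, ..., c-1} (np.ndarray).
    """
    n, c = noisy_posteriors.shape

    # T_spade[i, l] ~ P(intermediate = l | clean = i): read off the classifier
    # output at one anchor-like example per clean class (here simply the
    # example with the highest posterior for class i).
    t_spade = np.zeros((c, c))
    for i in range(c):
        anchor = np.argmax(noisy_posteriors[:, i])
        t_spade[i] = noisy_posteriors[anchor]

    # T_club[l, j] ~ P(noisy = j | intermediate = l): estimable by (soft)
    # counting, since both the intermediate class and the noisy label are
    # observed for every training example.
    t_club = np.zeros((c, c))
    for l in range(c):
        weights = noisy_posteriors[:, l]
        for j in range(c):
            t_club[l, j] = weights[noisy_labels == j].sum()
        t_club[l] /= max(weights.sum(), 1e-12)

    # Factorized estimate of the original transition matrix: T ~ T_spade @ T_club.
    return t_spade @ t_club
```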
Related papers
- Multi-Label Noise Transition Matrix Estimation with Label Correlations:
Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Instance-Dependent Label-Noise Learning with Manifold-Regularized Transition Matrix Estimation [172.81824511381984]
The transition matrix T(x) is unidentifiable under instance-dependent noise (IDN).
We propose an assumption on the geometry of T(x): "the closer two instances are, the more similar their corresponding transition matrices should be" (a rough sketch of such a regularizer appears after this list).
Our method is superior to state-of-the-art approaches for label-noise learning under the challenging IDN setting.
arXiv Detail & Related papers (2022-06-06T04:12:01Z)
- Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization [88.91872713134342]
We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
arXiv Detail & Related papers (2021-02-04T05:09:18Z)
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered (a rough sketch of jointly learning the transition matrix with the classifier appears after this list).
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
- Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [61.8970957519509]
This study proposes a new meta-transition-learning strategy for the task.
Specifically, through the sound guidance of a small set of meta data with clean labels, the noise transition matrix and the classifier parameters can be mutually ameliorated.
Our method can extract the transition matrix more accurately, which naturally leads to more robust performance than prior art.
arXiv Detail & Related papers (2020-06-10T07:27:25Z)
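For the manifold-regularized entry above, the quoted geometric assumption can be encoded as a pairwise smoothness penalty on per-instance transition matrices. This is only a rough illustration of the stated assumption, not that paper's actual objective; the Gaussian similarity weight and the function name are assumptions.

```python
import numpy as np

def manifold_regularizer(features, t_matrices):
    """Penalize dissimilar transition matrices T(x_i), T(x_j) for similar instances.

    features:   (n, d) instance representations (e.g., network embeddings).
    t_matrices: (n, c, c) per-instance transition-matrix estimates T(x_i).
    """
    n = features.shape[0]
    reg = 0.0
    # Quadratic in n; written as an explicit double loop for clarity only.
    for i in range(n):
        for j in range(i + 1, n):
            # Similarity weight: large when x_i and x_j are close in feature space.
            w = np.exp(-np.sum((features[i] - features[j]) ** 2))
            # The closer the instances, the more their T(x) are pulled together.
            reg += w * np.sum((t_matrices[i] - t_matrices[j]) ** 2)
    return reg
```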
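For the anchor-point-free entry above, here is a hedged sketch of learning the transition matrix end to end together with the classifier: the matrix is parameterized to be row-stochastic and trained through a forward-corrected likelihood on the noisy labels. The class and function names are illustrative, and the sketch omits whatever regularizer that paper uses to exploit the "sufficiently scattered" condition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableTransition(nn.Module):
    """Row-stochastic transition matrix learned jointly with the classifier."""
    def __init__(self, num_classes: int):
        super().__init__()
        # Start near the identity, i.e. assume little noise at initialization.
        self.logits = nn.Parameter(4.0 * torch.eye(num_classes))

    def forward(self) -> torch.Tensor:
        # Row i is P(noisy label | clean label = i); each row sums to one.
        return F.softmax(self.logits, dim=1)

def forward_corrected_loss(clean_posteriors, noisy_targets, transition):
    """Negative log-likelihood of noisy labels under P(noisy|x) = P(clean|x) @ T."""
    T = transition()                              # (c, c)
    noisy_posteriors = clean_posteriors @ T       # (batch, c)
    return F.nll_loss(torch.log(noisy_posteriors + 1e-12), noisy_targets)
```

In such a scheme, both the network producing `clean_posteriors` and the `LearnableTransition` module would be optimized on the same noisy-label loss.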