Estimating Instance-dependent Label-noise Transition Matrix using DNNs
- URL: http://arxiv.org/abs/2105.13001v1
- Date: Thu, 27 May 2021 08:36:54 GMT
- Title: Estimating Instance-dependent Label-noise Transition Matrix using DNNs
- Authors: Shuo Yang, Erkun Yang, Bo Han, Yang Liu, Min Xu, Gang Niu, Tongliang Liu
- Abstract summary: In label-noise learning, estimating the transition matrix is a hot topic.
In this paper, we propose to directly model the transition from the Bayes optimal distribution to the noisy distribution.
Exploiting the fact that Bayes optimal labels are less uncertain than clean labels, we estimate the Bayes label transition matrix with a deep neural network in a parameterized way.
- Score: 66.29979882301265
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In label-noise learning, estimating the transition matrix is a hot topic as
the matrix plays an important role in building statistically consistent
classifiers. Traditionally, the transition from clean distribution to noisy
distribution (i.e., clean label transition matrix) has been widely exploited to
learn a clean label classifier from the noisy data. Motivated by the fact that
classifiers mostly output Bayes optimal labels for prediction, in this paper,
we propose to directly model the transition from Bayes optimal distribution to
noisy distribution (i.e., Bayes label transition matrix) and learn a Bayes
optimal label classifier. Note that given only noisy data, it is ill-posed to
estimate either the clean label transition matrix or the Bayes label transition
matrix. Favorably, however, Bayes optimal labels are less uncertain than clean
labels, i.e., the class posteriors of Bayes optimal labels are one-hot vectors
while those of clean labels are not. This yields two advantages for estimating
the Bayes label transition matrix: (a) we can theoretically recover a set of
Bayes optimal labels under mild conditions; (b) the feasible solution space is
much smaller. By exploiting these advantages, we estimate the Bayes label
transition matrix by employing a deep neural network
in a parameterized way, leading to better generalization and superior
classification performance.
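The core construction admits a compact sketch. Below is a minimal, hypothetical PyTorch illustration (not the authors' released code) of parameterizing an instance-dependent transition matrix with a small network: because the Bayes class posterior is one-hot, the predicted noisy posterior for an example is simply the row of T(x) indexed by its (distilled) Bayes optimal label, and the transition network can be fit with cross-entropy against the observed noisy labels.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransitionNet(nn.Module):
    """Parameterizes an instance-dependent transition matrix T(x).

    T(x)[i, j] approximates P(noisy label = j | Bayes optimal label = i, x).
    Hypothetical sketch: layer sizes and inputs are illustrative only.
    """

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.num_classes = num_classes
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes * num_classes),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        logits = self.net(feats).view(-1, self.num_classes, self.num_classes)
        # Row-wise softmax so each row of T(x) is a valid distribution.
        return F.softmax(logits, dim=2)

def transition_loss(trans_net, feats, bayes_labels, noisy_labels):
    """Cross-entropy between the y*-th row of T(x) and the observed noisy label.

    Because the Bayes posterior is one-hot, the predicted noisy posterior is
    simply the row of T(x) indexed by the (distilled) Bayes optimal label.
    """
    T = trans_net(feats)                                  # (B, C, C)
    rows = T[torch.arange(feats.size(0)), bayes_labels]   # (B, C)
    return F.nll_loss(torch.log(rows + 1e-8), noisy_labels)
```

The sketch presumes a set of Bayes optimal labels has already been collected, e.g., distilled from confident model predictions, in line with advantage (a) above.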
Related papers
- Complementary to Multiple Labels: A Correlation-Aware Correction Approach [65.59584909436259]
We show theoretically how the estimated transition matrix in multi-class complementary-label learning (CLL) could be distorted in multi-labeled cases.
We propose a two-step method to estimate the transition matrix from candidate labels.
arXiv Detail & Related papers (2023-02-25T04:48:48Z)
- Multi-label Classification with High-rank and High-order Label Correlations [62.39748565407201]
Previous methods capture the high-order label correlations mainly by transforming the label matrix to a latent label space with low-rank matrix factorization.
We propose a simple yet effective method to depict the high-order label correlations explicitly, and at the same time maintain the high-rank of the label matrix.
Comparative studies over twelve benchmark data sets validate the effectiveness of the proposed algorithm in multi-label classification.
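For context, the low-rank mechanism this paper argues against can be shown in a few lines of NumPy (an illustration of the generic technique, not the paper's algorithm): truncated SVD maps the label matrix to a latent label space, capturing correlations but capping the rank of the reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)
Y = (rng.random((100, 12)) < 0.2).astype(float)  # n samples x 12 labels

# Low-rank factorization of the label matrix: Y ~= U @ V.T with rank k << 12.
k = 3
U_, s, Vt = np.linalg.svd(Y, full_matrices=False)
Y_lowrank = U_[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(np.linalg.matrix_rank(Y))          # typically 12 (full rank)
print(np.linalg.matrix_rank(Y_lowrank))  # 3: correlations kept, rank lost
```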
arXiv Detail & Related papers (2022-07-09T05:15:31Z)
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered.
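One way such identification results are operationalized, e.g., by volume-minimization methods, is to learn a class-dependent transition matrix jointly with the classifier while penalizing the volume of T. A minimal PyTorch sketch under that assumption (the log-det volume measure and the weight `lam` are illustrative choices):

```python
import torch
import torch.nn.functional as F

num_classes = 10
# Unconstrained parameters, initialized diagonally dominant so that the
# softmax-ed matrix starts near the identity and det(T) stays positive.
T_params = (4.0 * torch.eye(num_classes)).requires_grad_()

def volmin_loss(clean_logits, noisy_labels, lam=1e-4):
    """Noisy-label cross-entropy plus a log-det 'volume' penalty on T.

    Shrinking log det(T) contracts the simplex spanned by T's rows, which
    pins down T when the clean class posteriors are sufficiently scattered.
    """
    T = F.softmax(T_params, dim=1)                   # rows are distributions
    noisy_post = F.softmax(clean_logits, dim=1) @ T  # P(noisy y | x)
    ce = F.nll_loss(torch.log(noisy_post + 1e-8), noisy_labels)
    return ce + lam * torch.logdet(T)
```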
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under the mixed closed-set and open-set label noise.
Our method better models the mixed label noise, as evidenced by more robust performance than prior state-of-the-art label-noise learning methods.
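The "extended" idea can be pictured with a rectangular transition matrix. A hypothetical NumPy sketch, assuming a single extra open-set source class whose instances have true labels outside the closed label set (the numbers are made up for illustration):

```python
import numpy as np

C = 3  # closed-set classes; true labels may also be "open-set" (index C)

# Extended transition matrix: (C + 1) rows (true sources, incl. open-set)
# by C columns (observed noisy labels). Each row sums to 1.
T_ext = np.array([
    [0.8, 0.1, 0.1],   # true class 0
    [0.1, 0.8, 0.1],   # true class 1
    [0.1, 0.1, 0.8],   # true class 2
    [1/3, 1/3, 1/3],   # open-set instances forced into closed-set labels
])

# Posterior over (C + 1) true sources for one instance -> observed posterior.
p_true = np.array([0.0, 0.0, 0.1, 0.9])   # mostly open-set
p_noisy = p_true @ T_ext
print(p_noisy)  # distribution over the C observed labels
```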
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
- Error-Bounded Correction of Noisy Labels [17.510654621245656]
We show that the prediction of a noisy classifier can indeed be a good indicator of whether the label of a training example is clean.
Based on the theoretical result, we propose a novel algorithm that corrects the labels based on the noisy classifier prediction.
We incorporate our label correction algorithm into the training of deep neural networks and train models that achieve superior testing performance on multiple public datasets.
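The correction step reduces to a simple rule; a minimal PyTorch sketch (the confidence threshold and flip rule are assumptions for illustration, not the paper's exact error-bounded criterion):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def correct_labels(model, inputs, noisy_labels, threshold=0.9):
    """Replace a noisy label when the classifier is confidently sure of
    another class; keep the given label otherwise. Hypothetical rule."""
    probs = F.softmax(model(inputs), dim=1)
    conf, pred = probs.max(dim=1)
    flip = (conf > threshold) & (pred != noisy_labels)
    return torch.where(flip, pred, noisy_labels)
```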
arXiv Detail & Related papers (2020-11-19T19:23:23Z)
- Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning [157.2709657207203]
Existing methods for estimating the transition matrix rely heavily on estimating the noisy class posterior.
We introduce an intermediate class to avoid directly estimating the noisy class posterior.
By this intermediate class, the original transition matrix can then be factorized into the product of two easy-to-estimate transition matrices.
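The second factor is then estimable by simple counting once the intermediate class is fixed. A minimal NumPy sketch, assuming the intermediate class is the label predicted by a network trained on the noisy data:

```python
import numpy as np

def estimate_intermediate_to_noisy(pred_labels, noisy_labels, num_classes):
    """Estimate T[l, j] = P(noisy = j | intermediate = l) by counting.

    pred_labels: intermediate classes (the model's predicted labels).
    noisy_labels: observed noisy labels. Both are integer arrays.
    """
    T = np.zeros((num_classes, num_classes))
    for l, j in zip(pred_labels, noisy_labels):
        T[l, j] += 1
    row_sums = T.sum(axis=1, keepdims=True)
    return T / np.maximum(row_sums, 1)  # avoid division by zero

# The full estimator then composes this with the clean-to-intermediate
# factor: T_full = T_clean_to_intermediate @ T_intermediate_to_noisy.
```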
arXiv Detail & Related papers (2020-06-14T05:48:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.