Provably End-to-end Label-Noise Learning without Anchor Points
- URL: http://arxiv.org/abs/2102.02400v1
- Date: Thu, 4 Feb 2021 03:59:37 GMT
- Title: Provably End-to-end Label-Noise Learning without Anchor Points
- Authors: Xuefeng Li, Tongliang Liu, Bo Han, Gang Niu, Masashi Sugiyama
- Abstract summary: We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered.
- Score: 118.97592870124937
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In label-noise learning, the transition matrix plays a key role in building
statistically consistent classifiers. Existing consistent estimators for the
transition matrix have been developed by exploiting anchor points. However, the
anchor-point assumption is not always satisfied in real scenarios. In this
paper, we propose an end-to-end framework for solving label-noise learning
without anchor points, in which we simultaneously minimize two objectives: the
discrepancy between the distribution learned by the neural network and the
noisy class-posterior distribution, and the volume of the simplex formed by the
columns of the transition matrix. Our proposed framework can identify the
transition matrix if the clean class-posterior probabilities are sufficiently
scattered. This is by far the mildest assumption under which the transition
matrix is provably identifiable and the learned classifier is statistically
consistent. Experimental results on benchmark datasets demonstrate the
effectiveness and robustness of the proposed method.
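The two objectives above can be sketched as a single composite loss. The function name, the `lam` weight, and the NumPy formulation below are illustrative assumptions, not the authors' implementation; the convention here is that the model's clean posterior multiplied by the transition matrix gives the noisy posterior, and the log-determinant of the transition matrix stands in (up to constants) for the log-volume of the simplex spanned by its columns:

```python
import numpy as np

def volume_regularized_loss(clean_posteriors, transition, noisy_labels, lam=1e-4):
    """Composite objective sketch: fit the noisy class-posterior while
    shrinking the simplex spanned by the transition matrix's columns.

    clean_posteriors : (n, c) rows on the simplex (network outputs)
    transition       : (c, c) transition matrix T (illustrative convention:
                       noisy posterior = clean posterior @ T)
    noisy_labels     : (n,) integer noisy labels
    """
    # Noisy class-posterior implied by the model
    noisy_posteriors = clean_posteriors @ transition
    # Discrepancy term: negative log-likelihood of the observed noisy labels
    nll = -np.mean(np.log(
        noisy_posteriors[np.arange(len(noisy_labels)), noisy_labels] + 1e-12))
    # Volume term: log|det T| measures the simplex volume; minimizing it
    # drives T toward the identifiable solution when clean posteriors
    # are sufficiently scattered
    _, logdet = np.linalg.slogdet(transition)
    return nll + lam * logdet
```

In practice both the network producing `clean_posteriors` and `transition` would be learned jointly by gradient descent; the sketch only shows how the two terms combine.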
Related papers
- Dirichlet-based Per-Sample Weighting by Transition Matrix for Noisy
Label Learning [18.688802981093644]
We propose a new utilization method based on resampling, coined RENT.
RENT consistently outperforms existing transition-matrix utilization methods, including reweighting, on various benchmark datasets.
arXiv Detail & Related papers (2024-03-05T06:20:49Z)
- Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Learning from Multiple Annotators by Incorporating Instance Features [15.643325526074804]
Learning from multiple annotators aims to induce a high-quality classifier from training instances.
Most existing methods adopt class-level confusion matrices of annotators, assuming that the observed labels do not depend on the instance features.
We propose a noise transition matrix that incorporates the influence of instance features on annotators' performance, building on confusion matrices.
arXiv Detail & Related papers (2021-06-29T08:07:24Z)
- Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization [88.91872713134342]
We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
arXiv Detail & Related papers (2021-02-04T05:09:18Z)
- Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning [157.2709657207203]
Existing methods for estimating the transition matrix rely heavily on estimating the noisy class posterior.
We introduce an intermediate class to avoid directly estimating the noisy class posterior.
By this intermediate class, the original transition matrix can then be factorized into the product of two easy-to-estimate transition matrices.
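The factorization can be sketched as follows. The helper name `dual_t_estimate`, the matrix names, and the counting scheme for the second factor are assumptions for illustration; the first factor is simply taken as given, standing in for whatever estimate of P(intermediate | clean) is available:

```python
import numpy as np

def dual_t_estimate(t_club, intermediate_labels, noisy_labels, n_classes):
    """Dual-T style sketch (names hypothetical): factor the transition matrix as
    T = T_club @ T_spade, where
      T_club[i, k]  ~= P(intermediate = k | clean = i)  (assumed given/easy to estimate),
      T_spade[k, j] ~= P(noisy = j | intermediate = k)  (counted from hard labels).
    """
    counts = np.zeros((n_classes, n_classes))
    for k, j in zip(intermediate_labels, noisy_labels):
        counts[k, j] += 1.0  # co-occurrence of intermediate class k and noisy label j
    # Row-normalize the counts to get a stochastic matrix
    t_spade = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1.0)
    return t_club @ t_spade  # product of the two easy-to-estimate factors
```

The point of the factorization is that counting co-occurrences of hard labels (for the second factor) avoids the error introduced by directly estimating the noisy class posterior.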
arXiv Detail & Related papers (2020-06-14T05:48:20Z)
- Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [61.8970957519509]
This study proposes a new meta-transition-learning strategy for the task.
Specifically, through the sound guidance of a small set of meta data with clean labels, the noise transition matrix and the classifier parameters can be mutually ameliorated.
Our method extracts the transition matrix more accurately, which naturally leads to more robust performance than prior art.
arXiv Detail & Related papers (2020-06-10T07:27:25Z)
- Matrix Smoothing: A Regularization for DNN with Transition Matrix under Noisy Labels [54.585681272543056]
Training deep neural networks (DNNs) in the presence of noisy labels is an important and challenging task.
Recent probabilistic methods directly apply the transition matrix to DNNs but neglect DNNs' susceptibility to overfitting.
We propose a novel method in which a smoothed transition matrix is used for updating the DNN, restricting overfitting.
arXiv Detail & Related papers (2020-03-26T13:49:37Z)
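As an illustration of smoothing a transition matrix, here is one simple scheme that interpolates each row toward the uniform distribution; the paper's actual smoothing recipe may differ, and the function name and `beta` parameter are assumptions:

```python
import numpy as np

def smooth_transition(transition, beta=0.1):
    """Illustrative smoothing scheme (the paper's exact recipe may differ):
    interpolate each row of T toward the uniform distribution. This keeps T
    row-stochastic while softening the peaks a DNN could otherwise overfit to."""
    c = transition.shape[0]
    uniform = np.full_like(transition, 1.0 / c)
    return (1.0 - beta) * transition + beta * uniform
```

With `beta=0`, the original matrix is recovered; larger `beta` pulls every row closer to uniform noise.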
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.