Confidence Scores Make Instance-dependent Label-noise Learning Possible
- URL: http://arxiv.org/abs/2001.03772v2
- Date: Mon, 22 Feb 2021 23:40:07 GMT
- Title: Confidence Scores Make Instance-dependent Label-noise Learning Possible
- Authors: Antonin Berthon and Bo Han and Gang Niu and Tongliang Liu and Masashi Sugiyama
- Abstract summary: In learning with noisy labels, for every instance, its label can randomly walk to other classes following a transition distribution which is named a noise model.
We introduce confidence-scored instance-dependent noise (CSIDN), where each instance-label pair is equipped with a confidence score.
We find that, with the help of confidence scores, the transition distribution of each instance can be approximately estimated.
- Score: 129.84497190791103
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In learning with noisy labels, for every instance, its label can randomly
walk to other classes following a transition distribution which is named a
noise model. Well-studied noise models are all instance-independent, namely,
the transition depends only on the original label but not the instance itself,
and thus they are less practical in the wild. Fortunately, methods based on
instance-dependent noise have been studied, but most of them have to rely on
strong assumptions on the noise models. To alleviate this issue, we introduce
confidence-scored instance-dependent noise (CSIDN), where each instance-label
pair is equipped with a confidence score. We find that, with the help of
confidence scores, the transition distribution of each instance can be
approximately estimated. Similar to the powerful forward correction for
instance-independent noise, we propose a novel instance-level forward
correction for CSIDN. We demonstrate the utility and effectiveness of our
method through multiple experiments under synthetic label noise and real-world
unknown noise.
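The abstract builds on forward correction, which trains against the observed noisy labels by pushing the model's clean-class posteriors through the noise transition matrix. The paper's contribution is an instance-level variant in which the transition matrix varies per instance (estimated from confidence scores); as a minimal, hedged sketch of only the classic instance-independent version, with a toy symmetric-noise matrix `T` chosen for illustration:

```python
import numpy as np

def forward_corrected_nll(clean_probs, noisy_label, T):
    """Forward correction: map the predicted clean-class posteriors
    through the transition matrix T (T[i, j] = P(noisy=j | clean=i)),
    then take the negative log-likelihood of the observed noisy label."""
    noisy_probs = clean_probs @ T  # predicted distribution over noisy labels
    return -np.log(noisy_probs[noisy_label])

# Toy example: 3 classes with 20% symmetric label noise.
T = np.full((3, 3), 0.1)
np.fill_diagonal(T, 0.8)
clean_probs = np.array([0.7, 0.2, 0.1])  # model's clean-class posteriors
loss = forward_corrected_nll(clean_probs, noisy_label=0, T=T)
```

In the instance-level correction proposed for CSIDN, the single global `T` above would be replaced by a per-instance transition distribution estimated with the help of the confidence scores.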
Related papers
- InstanT: Semi-supervised Learning with Instance-dependent Thresholds [75.91684890150283]
We propose the study of instance-dependent thresholds, which offer the highest degree of freedom compared with existing methods.
We devise a novel instance-dependent threshold function for all unlabeled instances by utilizing their instance-level ambiguity and the instance-dependent error rates of pseudo-labels.
arXiv Detail & Related papers (2023-10-29T05:31:43Z) - Instance-dependent Label Distribution Estimation for Learning with Label Noise [20.479674500893303]
Noise transition matrix (NTM) estimation is a promising approach for learning with label noise.
We propose an Instance-dependent Label Distribution Estimation (ILDE) method to learn from noisy labels for image classification.
Our results indicate that the proposed ILDE method outperforms all competing methods, whether the noise is synthetic or real.
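The blurb above centers on noise transition matrix (NTM) estimation. As a hedged sketch of the classic anchor-point estimator for a global NTM (not ILDE's instance-dependent method, whose details are not given here): for each clean class, the instance the model deems most likely to belong to that class serves as an "anchor", and its noisy-label posterior approximates the corresponding row of the transition matrix.

```python
import numpy as np

def estimate_T_anchor_points(noisy_posteriors):
    """Anchor-point estimate of the noise transition matrix.

    noisy_posteriors: (n_samples, n_classes) array of a model's
    posteriors over *noisy* labels. For each clean class i, the
    instance with the highest posterior for i is treated as an
    anchor; its posterior vector approximates row i of T,
    where T[i, j] = P(noisy=j | clean=i)."""
    n_classes = noisy_posteriors.shape[1]
    T = np.empty((n_classes, n_classes))
    for i in range(n_classes):
        anchor = np.argmax(noisy_posteriors[:, i])  # most confident instance for class i
        T[i] = noisy_posteriors[anchor]
    return T

# Toy posteriors: each row is one instance's noisy-label distribution.
posteriors = np.array([[0.9, 0.05, 0.05],
                       [0.1, 0.80, 0.10],
                       [0.2, 0.10, 0.70]])
T = estimate_T_anchor_points(posteriors)
```

Instance-dependent estimators such as ILDE go beyond this by letting the transition distribution vary across instances rather than fixing one global `T`.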
arXiv Detail & Related papers (2022-12-16T10:13:25Z) - Identifying Hard Noise in Long-Tailed Sample Distribution [76.16113794808001]
We introduce Noisy Long-Tailed Classification (NLT).
Most de-noising methods fail to identify hard noise samples.
We design an iterative noisy-learning framework called Hard-to-Easy (H2E).
arXiv Detail & Related papers (2022-07-27T09:03:03Z) - Denoising Distantly Supervised Named Entity Recognition via a Hypergeometric Probabilistic Model [26.76830553508229]
Hypergeometric Learning (HGL) is a denoising algorithm for distantly supervised named entity recognition.
HGL takes both noise distribution and instance-level confidence into consideration.
Experiments show that HGL can effectively denoise the weakly-labeled data retrieved from distant supervision.
arXiv Detail & Related papers (2021-06-17T04:01:25Z) - Approximating Instance-Dependent Noise via Instance-Confidence Embedding [87.65718705642819]
Label noise in multiclass classification is a major obstacle to the deployment of learning systems.
We investigate the instance-dependent noise (IDN) model and propose an efficient approximation of IDN to capture the instance-specific label corruption.
arXiv Detail & Related papers (2021-03-25T02:33:30Z) - Tackling Instance-Dependent Label Noise via a Universal Probabilistic Model [80.91927573604438]
This paper proposes a simple yet universal probabilistic model, which explicitly relates noisy labels to their instances.
Experiments on datasets with both synthetic and real-world label noise verify that the proposed method yields significant improvements on robustness.
arXiv Detail & Related papers (2021-01-14T05:43:51Z) - A Second-Order Approach to Learning with Instance-Dependent Label Noise [58.555527517928596]
The presence of label noise often misleads the training of deep neural networks.
We show that the errors in human-annotated labels are more likely to be dependent on the difficulty levels of tasks.
arXiv Detail & Related papers (2020-12-22T06:36:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents (including all listed content) and is not responsible for any consequences arising from its use.