Part-dependent Label Noise: Towards Instance-dependent Label Noise
- URL: http://arxiv.org/abs/2006.07836v2
- Date: Thu, 3 Dec 2020 02:21:51 GMT
- Title: Part-dependent Label Noise: Towards Instance-dependent Label Noise
- Authors: Xiaobo Xia, Tongliang Liu, Bo Han, Nannan Wang, Mingming Gong, Haifeng
Liu, Gang Niu, Dacheng Tao, Masashi Sugiyama
- Abstract summary: Learning with the instance-dependent label noise is challenging, because it is hard to model such real-world noise.
In this paper, we approximate the instance-dependent label noise by exploiting part-dependent label noise.
Empirical evaluations on synthetic and real-world datasets demonstrate our method is superior to the state-of-the-art approaches.
- Score: 194.73829226122731
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning with the \textit{instance-dependent} label noise is challenging,
because it is hard to model such real-world noise. Note that there is
psychological and physiological evidence showing that we humans perceive
instances by decomposing them into parts. Annotators are therefore more likely
to annotate instances based on the parts rather than the whole instances, where
a wrong mapping from parts to classes may cause the instance-dependent label
noise. Motivated by this human cognition, in this paper, we approximate the
instance-dependent label noise by exploiting \textit{part-dependent} label
noise. Specifically, since instances can be approximately reconstructed by a
combination of parts, we approximate the instance-dependent \textit{transition
matrix} for an instance by a combination of the transition matrices for the
parts of the instance. The transition matrices for parts can be learned by
exploiting anchor points (i.e., data points that belong to a specific class
almost surely). Empirical evaluations on synthetic and real-world datasets
demonstrate our method is superior to the state-of-the-art approaches for
learning from the instance-dependent label noise.
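The abstract's core idea can be sketched in a few lines: reconstruct an instance from learned parts, then combine the per-part transition matrices with the reconstruction weights. This is an illustrative sketch only; the function and variable names are hypothetical, and the parts dictionary is assumed to come from some decomposition such as non-negative matrix factorization.

```python
import numpy as np

def part_dependent_transition(x, part_dict, part_matrices):
    """Approximate the instance-dependent transition matrix T(x) as a
    convex combination of per-part transition matrices (hypothetical
    sketch, not the authors' released code).

    x            : (d,) feature vector
    part_dict    : (d, p) matrix whose columns are learned parts
                   (e.g. from non-negative matrix factorization)
    part_matrices: (p, c, c) stack of per-part transition matrices,
                   each assumed row-stochastic
    """
    # Reconstruction weights: how much each part contributes to x.
    w, *_ = np.linalg.lstsq(part_dict, x, rcond=None)
    w = np.clip(w, 0.0, None)       # parts contribute non-negatively
    w = w / w.sum()                 # normalize so T(x) stays row-stochastic
    # T(x) = sum_i w_i * T_i
    return np.einsum("p,pij->ij", w, part_matrices)
```

Because the weights are non-negative and sum to one, the combined matrix is again row-stochastic, so it remains a valid transition matrix.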
Related papers
- Learning Causal Transition Matrix for Instance-dependent Label Noise [40.634344530749324]
We study the data generation process of noisy labels from a causal perspective.
An unobservable latent variable can affect either the instance itself, the label annotation procedure, or both.
We have designed a novel training framework that explicitly models this causal relationship.
arXiv Detail & Related papers (2024-12-18T05:33:16Z)
- Estimating Noisy Class Posterior with Part-level Labels for Noisy Label Learning [13.502549812291878]
Existing methods typically learn noisy class posteriors by training a classification model with noisy labels.
This paper proposes to augment the supervised information with part-level labels, encouraging the model to focus on and integrate richer information from various parts.
Our method is theoretically sound, while experiments show that it is empirically effective in synthetic and real-world noisy benchmarks.
arXiv Detail & Related papers (2024-05-08T12:13:40Z)
- Rethinking the Value of Labels for Instance-Dependent Label Noise Learning [43.481591776038144]
Noisy labels in real-world applications often depend on both the true label and the features.
In this work, we tackle instance-dependent label noise with a novel deep generative model that avoids explicitly modeling the noise transition matrix.
Our algorithm leverages causal representation learning and simultaneously identifies the high-level content and style latent factors from the data.
arXiv Detail & Related papers (2023-05-10T15:29:07Z)
- Instance-dependent Label-noise Learning under a Structural Causal Model [92.76400590283448]
Label noise degrades the performance of deep learning algorithms.
By leveraging a structural causal model, we propose a novel generative approach for instance-dependent label-noise learning.
arXiv Detail & Related papers (2021-09-07T10:42:54Z)
- Approximating Instance-Dependent Noise via Instance-Confidence Embedding [87.65718705642819]
Label noise in multiclass classification is a major obstacle to the deployment of learning systems.
We investigate the instance-dependent noise (IDN) model and propose an efficient approximation of IDN to capture the instance-specific label corruption.
arXiv Detail & Related papers (2021-03-25T02:33:30Z)
- Class2Simi: A Noise Reduction Perspective on Learning with Noisy Labels [98.13491369929798]
We propose a framework called Class2Simi, which transforms data points with noisy class labels to data pairs with noisy similarity labels.
Class2Simi is computationally efficient: the transformation is performed on the fly within mini-batches, and it only changes the loss on top of the model's predictions into a pairwise form.
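The on-the-fly transformation this summary describes can be sketched as a one-liner over a mini-batch: two examples get similarity label 1 if their (possibly noisy) class labels agree, and 0 otherwise. The function name is illustrative, not from the paper's code.

```python
import numpy as np

def class_to_similarity(labels):
    """Turn a mini-batch of (possibly noisy) class labels into pairwise
    similarity labels: 1 if two examples share the same class, else 0.
    Hypothetical sketch of the Class2Simi-style transformation."""
    labels = np.asarray(labels).reshape(-1, 1)
    # Broadcast comparison yields an (n, n) matrix of pairwise agreements.
    return (labels == labels.T).astype(int)
```

The resulting matrix is symmetric with ones on the diagonal, which is why the loss only needs to be rewritten in a pairwise manner rather than retraining on new data.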
arXiv Detail & Related papers (2020-06-14T07:55:32Z)
- Multi-Class Classification from Noisy-Similarity-Labeled Data [98.13491369929798]
We propose a method for learning from only noisy-similarity-labeled data.
We use a noise transition matrix to bridge the class-posterior probability between clean and noisy data.
We build a novel learning system which can assign noise-free class labels for instances.
arXiv Detail & Related papers (2020-02-16T05:10:21Z)
- Confidence Scores Make Instance-dependent Label-noise Learning Possible [129.84497190791103]
In learning with noisy labels, the label of every instance can randomly flip to other classes following a transition distribution, which is called a noise model.
We introduce confidence-scored instance-dependent noise (CSIDN), where each instance-label pair is equipped with a confidence score.
We find that, with the help of confidence scores, the transition distribution of each instance can be approximately estimated.
arXiv Detail & Related papers (2020-01-11T16:15:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.