Instance-Dependent Label-Noise Learning with Manifold-Regularized
Transition Matrix Estimation
- URL: http://arxiv.org/abs/2206.02791v1
- Date: Mon, 6 Jun 2022 04:12:01 GMT
- Title: Instance-Dependent Label-Noise Learning with Manifold-Regularized
Transition Matrix Estimation
- Authors: De Cheng, Tongliang Liu, Yixiong Ning, Nannan Wang, Bo Han, Gang Niu,
Xinbo Gao, Masashi Sugiyama
- Abstract summary: The transition matrix T(x) is unidentifiable under instance-dependent noise (IDN).
We propose an assumption on the geometry of T(x): "the closer two instances are, the more similar their corresponding transition matrices should be".
Our method is superior to state-of-the-art approaches for label-noise learning under the challenging IDN.
- Score: 172.81824511381984
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In label-noise learning, estimating the transition matrix has
attracted increasing attention because the matrix plays an important role in
building statistically consistent classifiers. However, it is very challenging
to estimate the transition matrix T(x), where x denotes the instance, because
it is unidentifiable under instance-dependent noise (IDN). To address this
problem, we note that there is psychological and physiological evidence
showing that humans are more likely to annotate instances of similar
appearance as the same class, and thus poor-quality or ambiguous instances of
similar appearance are more easily mislabeled to correlated or identical noisy
classes. We therefore propose an assumption on the geometry of T(x): "the
closer two instances are, the more similar their corresponding transition
matrices should be". More specifically, we formulate this assumption as a
manifold embedding to effectively reduce the degrees of freedom of T(x) and
make it stably estimable in practice. The proposed manifold-regularized
technique works by directly reducing the estimation error without hurting the
approximation error of the estimation problem of T(x). Experimental
evaluations on four synthetic and two real-world datasets demonstrate that our
method is superior to state-of-the-art approaches for label-noise learning
under the challenging IDN.
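The geometric assumption above can be turned into a concrete penalty: instances that are close in feature space should have similar transition matrices. The following is a minimal NumPy sketch of such a manifold regularizer, not the authors' actual implementation; the Gaussian-kernel weights, the function name, and the bandwidth parameter `sigma` are illustrative assumptions.

```python
import numpy as np

def manifold_regularizer(features, T, sigma=1.0):
    """Penalize differences between transition matrices of nearby instances.

    features: (n, d) array of instance embeddings.
    T: (n, c, c) array of per-instance transition matrices (rows sum to 1).
    Returns sum_{i,j} w_ij * ||T_i - T_j||_F^2, where w_ij is a Gaussian
    similarity weight on the instance distance (an illustrative choice).
    """
    # pairwise squared distances between instance embeddings
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # squared Frobenius distance between every pair of transition matrices
    diff = ((T[:, None] - T[None, :]) ** 2).sum(axis=(2, 3))
    return float((w * diff).sum())
```

In training, a term like this would be added to the classification loss so that nearby instances are pushed toward similar T(x), reducing the effective degrees of freedom of the estimation problem.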
Related papers
- Multi-Label Noise Transition Matrix Estimation with Label Correlations:
Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Learning Noise Transition Matrix from Only Noisy Labels via Total
Variation Regularization [88.91872713134342]
We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
arXiv Detail & Related papers (2021-02-04T05:09:18Z)
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered.
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
- Dual T: Reducing Estimation Error for Transition Matrix in Label-noise
Learning [157.2709657207203]
Existing methods for estimating the transition matrix rely heavily on estimating the noisy class posterior.
We introduce an intermediate class to avoid directly estimating the noisy class posterior.
By this intermediate class, the original transition matrix can then be factorized into the product of two easy-to-estimate transition matrices.
arXiv Detail & Related papers (2020-06-14T05:48:20Z)
- Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [61.8970957519509]
This study proposes a new meta-transition-learning strategy for the task.
Specifically, through the sound guidance of a small set of meta data with clean labels, the noise transition matrix and the classifier parameters can be mutually ameliorated.
Our method extracts the transition matrix more accurately, which naturally leads to more robust performance than prior art.
arXiv Detail & Related papers (2020-06-10T07:27:25Z)
- A Practical Framework for Relation Extraction with Noisy Labels Based on
Doubly Transitional Loss [14.121872633596452]
We introduce a practical end-to-end deep learning framework for automatic labeling.
One transition is parameterized by a non-linear transformation between hidden layers.
Another is an explicit probability transition matrix that captures the direct conversion between labels.
arXiv Detail & Related papers (2020-04-28T19:38:20Z)
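A common thread across the papers above is forward loss correction: once a transition matrix is known, clean class posteriors are mapped to noisy ones so the classifier can be trained consistently on noisy labels. The following is a minimal NumPy sketch of that general idea, not taken from any one paper; the 3-class matrix and flip probability are hypothetical.

```python
import numpy as np

def forward_correct(clean_posterior, T):
    """Map clean class posteriors to noisy posteriors via transition matrix T,
    where T[i, j] = P(noisy label = j | true label = i)."""
    return clean_posterior @ T

# Hypothetical 3-class example: class 0 flips to class 1 with probability 0.3.
T = np.array([[0.7, 0.3, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
clean = np.array([1.0, 0.0, 0.0])   # confident clean prediction: class 0
noisy = forward_correct(clean, T)   # -> [0.7, 0.3, 0.0]
```

Training against the corrected posterior `noisy` (rather than `clean`) is what makes the resulting classifier statistically consistent under the assumed noise model; the instance-dependent setting replaces the fixed T with a per-instance T(x).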
This list is automatically generated from the titles and abstracts of the papers in this site.