Enhanced Meta Label Correction for Coping with Label Corruption
- URL: http://arxiv.org/abs/2305.12961v2
- Date: Tue, 17 Oct 2023 07:44:33 GMT
- Title: Enhanced Meta Label Correction for Coping with Label Corruption
- Authors: Mitchell Keren Taraday, Chaim Baskin
- Abstract summary: We propose an Enhanced Meta Label Correction approach abbreviated as EMLC for the learning with noisy labels problem.
EMLC outperforms prior approaches and achieves state-of-the-art results in all standard benchmarks.
- Score: 3.6804038214708577
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Traditional methods for learning in the presence of noisy labels have
successfully handled datasets with artificially injected noise but still fall
short of adequately handling real-world noise. With the increasing use of
meta-learning across the diverse fields of machine learning, researchers have
leveraged small auxiliary clean datasets to meta-correct the training labels.
Nonetheless, existing meta-label correction approaches do not fully exploit
their potential. In this study, we propose an Enhanced Meta Label Correction
approach, abbreviated as EMLC, for the learning with noisy labels (LNL) problem.
We re-examine the meta-learning process and introduce faster and more accurate
meta-gradient derivations. We propose a novel teacher architecture tailored
explicitly to the LNL problem, equipped with novel training objectives. EMLC
outperforms prior approaches and achieves state-of-the-art results on all
standard benchmarks. Notably, EMLC improves upon the previous art on the noisy
real-world dataset Clothing1M by $1.52\%$ while requiring only half the time
per epoch and converging much faster on the meta-objective than the baseline
approach.
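The meta-correction scheme in the abstract is a bi-level loop: a teacher corrects the labels of each noisy batch, the student takes a virtual gradient step on the corrected labels, and the teacher is then updated by the meta-gradient of the virtual student's loss on the small clean set. Below is a minimal sketch of that generic loop in PyTorch; the one-step approximation, the teacher's input signature, and all names are illustrative assumptions, not EMLC's actual meta-gradient derivations or teacher architecture.

```python
# Minimal sketch of a generic meta label correction step (illustrative;
# not EMLC's exact derivation or teacher design).
import torch
import torch.nn.functional as F

def meta_label_correction_step(student, teacher, opt_student, opt_teacher,
                               noisy_x, noisy_y, clean_x, clean_y, lr=0.1):
    # 1) Teacher proposes corrected soft labels for the noisy batch
    #    (the teacher is assumed here to see inputs and noisy labels).
    soft_labels = teacher(noisy_x, noisy_y).softmax(dim=-1)

    # 2) Virtual student step on the corrected labels; create_graph=True
    #    keeps the graph so gradients can flow back into the teacher.
    train_loss = F.kl_div(student(noisy_x).log_softmax(dim=-1),
                          soft_labels, reduction="batchmean")
    grads = torch.autograd.grad(train_loss, tuple(student.parameters()),
                                create_graph=True)
    virtual_params = {name: p - lr * g for (name, p), g
                      in zip(student.named_parameters(), grads)}

    # 3) Meta-objective: loss of the virtual student on the clean meta
    #    set; its gradient w.r.t. the teacher updates the teacher.
    meta_logits = torch.func.functional_call(student, virtual_params, (clean_x,))
    meta_loss = F.cross_entropy(meta_logits, clean_y)
    opt_teacher.zero_grad()
    meta_loss.backward()
    opt_teacher.step()

    # 4) Real student step on labels from the freshly updated teacher.
    with torch.no_grad():
        soft_labels = teacher(noisy_x, noisy_y).softmax(dim=-1)
    loss = F.kl_div(student(noisy_x).log_softmax(dim=-1),
                    soft_labels, reduction="batchmean")
    opt_student.zero_grad()
    loss.backward()
    opt_student.step()
    return meta_loss.item()
```

EMLC's stated contributions sit precisely in this loop: faster and more accurate derivations of the step-3 meta-gradient, plus a teacher architecture built specifically for LNL.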
Related papers
- BatMan-CLR: Making Few-shots Meta-Learners Resilient Against Label Noise [5.67944073225624]
We present the first analysis of the impact of varying levels of label noise on the performance of state-of-the-art meta-learners.
We show that the accuracy of Reptile, iMAML, and foMAML drops by up to 42% on the Omniglot and CifarFS datasets when meta-training is affected by label noise.
We propose two sampling techniques, namely manifold (Man) and batch manifold (BatMan), which transform the noisy supervised learners into semi-supervised ones.
arXiv Detail & Related papers (2023-09-12T08:30:35Z) - DAC-MR: Data Augmentation Consistency Based Meta-Regularization for Meta-Learning [55.733193075728096]
We propose a meta-knowledge informed meta-learning (MKIML) framework to improve meta-learning.
We first integrate meta-knowledge into the meta-objective through an appropriate meta-regularization (MR) objective.
The proposed DAC-MR promises to learn well-performing meta-models from training tasks with noisy, sparse or unavailable meta-data.
arXiv Detail & Related papers (2023-05-13T11:01:47Z) - Faster Meta Update Strategy for Noise-Robust Deep Learning [62.08964100618873]
We introduce a novel Faster Meta Update Strategy (FaMUS) to replace the most expensive step in the meta gradient with a faster layer-wise approximation.
We show our method is able to save two-thirds of the training time while maintaining comparable, or even achieving better, generalization performance.
arXiv Detail & Related papers (2021-04-30T16:19:07Z) - DAGA: Data Augmentation with a Generation Approach for Low-resource Tagging Tasks [88.62288327934499]
We propose a novel augmentation method using language models trained on linearized labeled sentences (see the linearization sketch after this list).
Our method is applicable to both supervised and semi-supervised settings.
arXiv Detail & Related papers (2020-11-03T07:49:15Z) - Learning to Purify Noisy Labels via Meta Soft Label Corrector [49.92310583232323]
Recent deep neural networks (DNNs) can easily overfit to biased training data with noisy labels.
A label correction strategy is commonly used to alleviate this issue.
We propose a meta-learning model that estimates soft labels through a meta-gradient descent step.
arXiv Detail & Related papers (2020-08-03T03:25:17Z) - Meta Soft Label Generation for Noisy Labels [0.0]
We propose a Meta Soft Label Generation algorithm called MSLG.
MSLG jointly generates soft labels and learns network parameters using meta-learning techniques.
Our approach outperforms other state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2020-07-11T19:37:44Z) - Semi-Supervised Learning with Meta-Gradient [123.26748223837802]
We propose a simple yet effective meta-learning algorithm for semi-supervised learning.
We find that the proposed algorithm performs favorably against state-of-the-art methods.
arXiv Detail & Related papers (2020-07-08T08:48:56Z) - Meta-Semi: A Meta-learning Approach for Semi-supervised Learning [43.218180383591196]
We propose a novel meta-learning based SSL algorithm (Meta-Semi).
We show theoretically that Meta-Semi converges to the stationary point of the loss function on labeled data under mild conditions.
Empirically, Meta-Semi outperforms state-of-the-art SSL algorithms significantly on the challenging semi-supervised CIFAR-100 and STL-10 tasks.
arXiv Detail & Related papers (2020-07-05T17:31:14Z) - Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [61.8970957519509]
This study proposes a new meta transition adaptation strategy for robust deep learning with noisy labels.
Specifically, under the sound guidance of a small set of meta data with clean labels, the noise transition matrix and the classifier parameters can be mutually ameliorated.
Our method extracts the transition matrix more accurately, which naturally yields more robust performance than prior art (see the forward-correction sketch after this list).
arXiv Detail & Related papers (2020-06-10T07:27:25Z)
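For the DAGA entry above, the core trick is linearization: tag tokens are interleaved with the words they annotate so that a standard language model can be trained on labeled sentences and sampled for new ones. A minimal sketch of one plausible linearization scheme follows; the paper's exact format may differ, and the function and tag conventions are illustrative.

```python
# One plausible linearization of a tagged sentence for LM training,
# per the DAGA entry above (illustrative; the paper's exact scheme
# may differ).
def linearize(tokens: list[str], tags: list[str]) -> str:
    out: list[str] = []
    for tok, tag in zip(tokens, tags):
        if tag != "O":      # keep non-trivial tags as prefix tokens
            out.append(tag)
        out.append(tok)
    return " ".join(out)

# linearize(["John", "lives", "in", "Paris"],
#           ["B-PER", "O", "O", "B-LOC"])
# -> "B-PER John lives in B-LOC Paris"
# Sentences sampled from an LM trained on such strings are then
# de-linearized back into (token, tag) pairs.
```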
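For the Meta Transition Adaptation entry, classifiers in this line of work are typically trained with forward loss correction: the model's clean-label probabilities are pushed through the noise transition matrix before being scored against the observed noisy labels. A minimal sketch, assuming the matrix T has already been estimated (the paper's meta-learned estimation of T from clean meta data is omitted; names are illustrative):

```python
# Forward loss correction with a noise transition matrix T
# (illustrative; the meta-learning of T, the paper's contribution,
# is omitted and T is assumed given).
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits: torch.Tensor, noisy_y: torch.Tensor,
                           T: torch.Tensor) -> torch.Tensor:
    """T[i, j] ~ p(observed label j | true label i); rows sum to 1."""
    clean_probs = logits.softmax(dim=-1)  # belief over true labels
    noisy_probs = clean_probs @ T         # push through the noise channel
    return F.nll_loss(noisy_probs.clamp_min(1e-8).log(), noisy_y)
```

When T matches the true corruption process, minimizing this loss steers the model toward the clean-label distribution, which is why accurately extracting T, here refined jointly with the classifier via clean meta data, matters.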